
WO2020192451 - REAL-TIME PICTURE PROJECTION METHOD AND APPARATUS OF AR GLASSES SCREEN, CONTROLLER AND MEDIUM


Description

Title of Invention : REAL-TIME PICTURE PROJECTION METHOD AND APPARATUS OF AR GLASSES SCREEN, CONTROLLER AND MEDIUM

Field of the Invention

[0001]
The present invention relates to the technical field of augmented reality, and in particular to a real-time picture projection method and apparatus of an AR glasses screen, a controller and a medium.

Background of the Invention

[0002]
In recent years, augmented reality (Augmented Reality, referred to as AR) technology has developed rapidly and new AR products are constantly introduced. However, most of these products are oriented toward professional or high-end users, and many people find it difficult to understand or gain access to AR products. Therefore, AR glasses can be used to project the contents presented on the screen, that is, the real-time picture on the AR glasses screen, to an extended display, so as to promote AR technology and products such as AR glasses. The extended display is an external display connected to the AR glasses.
[0003]
The contents displayed on the AR glasses screen are the superposition of a virtual user interface (UI) on the glasses screen and the contents of the actual field of view of the glasses user. The AR glasses screen is often transparent or semitransparent: the glasses screen is only responsible for displaying the corresponding virtual UI contents, while the actual field-of-view scenario serves as the background of the virtual UI. Therefore, the final picture projected by the AR glasses is a synthetic result picture of the real-time scenario picture captured by an AR glasses camera and the virtual UI presented on the glasses screen, wherein the real-time scenario picture captured by the AR glasses camera is the real field-of-view content of the glasses user, referred to as the camera picture.
[0004]
The prior art is generally implemented by the following two technical solutions:
[0005]
(1) Screen mirror image technology: the camera picture of Android AR glasses is imported into the main display of the glasses and is displayed there together with the virtual UI contents; the result picture is then output to the extended display through a screen mirror image. Android AR glasses refers to AR glasses equipped with an Android system. However, if the real-time picture of the AR glasses is projected to the extended display through a single screen mirror image alone, the AR glasses user sees the camera picture on the glasses screen while the projection is running, and meanwhile the virtual UI of the AR glasses cannot be accurately attached to the real scenario; as a result, the user experience is poor.
[0006]
(2) Background server synthesis technology: the AR glasses send two video streams to a background video stream server at the same time, one carrying the camera picture of the AR glasses and the other carrying the screen mirror image of the virtual UI on the glasses screen. The two video streams are combined into one picture in the background video stream server, and the picture is sent to the extended display for display. This approach avoids the degradation of the user experience of the AR glasses caused by a single screen mirror image. However, synthesizing the video stream picture requires support from the background video stream server and introduces delay; the result picture can only be shown on the extended display after the video streams are decoded by a terminal with an independent computing unit; the solution depends on an intermediate device and imposes an additional burden on the development and deployment of supporting software, so the cost is high and it is not suitable for rapid, lightweight local deployment. In addition, most video stream services rely on the support of a public network and a wireless network, which is very limiting for small local area networks, point-to-point Wi-Fi Direct connections, and even wired transmission.
[0007]
Summary of the Invention
[0008]
The purpose of the present invention is to provide a real-time picture projection method and apparatus of an AR glasses screen, a controller and a medium. Through differentiated content display on the AR glasses screen and an extended display, the glasses main screen does not need to display the camera picture, and no additional server or similar infrastructure is needed to achieve projection, thereby reducing the development cost, enabling rapid deployment, and improving the user experience.
[0009]
In order to solve the above technical problem, according to a first embodiment of the present invention, a real-time picture projection method of an AR glasses screen is provided, including:
[0010]
detecting whether AR glasses are connected with an external extended display;
[0011]
if yes, collecting a camera picture through a camera disposed on the AR glasses, and collecting a virtual UI through a main screen of the AR glasses;
[0012]
performing picture synthesis based on the camera picture and the virtual UI to generate a synthetic result picture, and sending the synthetic result picture to the extended display; and
[0013]
performing differentiated content display on the extended display and the main screen of the AR glasses.
[0014]
Further, the detecting whether AR glasses are connected with an external extended display includes:
[0015]
detecting whether the AR glasses are connected with the external extended display through a first interface, wherein the first interface includes a display manager interface and a media router interface.
[0016]
Further, the performing picture synthesis based on the camera picture and the virtual UI includes:
[0017]
defining a second interface subclass, wherein the second interface includes a presentation interface;
[0018]
associating a map layer with the second interface within an initial creation cycle of the life cycle of the second interface;
[0019]
presenting the camera picture at the bottom of the map layer of the second interface; and
[0020]
synchronously presenting the virtual UI in the map layer of the second interface, and overlapping the camera picture and the virtual UI to generate the synthetic result picture.
[0021]
Further, the synchronously presenting the virtual UI in the map layer of the second interface includes:
[0022]
obtaining a drawing cache of the virtual UI, wherein the drawing cache is a full-screen virtual UI drawing cache or a partial virtual UI drawing cache;
[0023]
generating a bitmap based on the drawing cache; and
[0024]
creating a new view extension subclass, defining the new view extension subclass as a drawing board, declaring that a background color drawn on the canvas is transparent, drawing the bitmap in the drawing board, adding the bitmap to the map layer of the second interface, and superposing the map layer of the second interface on the camera picture.
[0025]
Further, when the drawing cache to be obtained is the partial virtual UI drawing cache,
[0026]
constructing an extension subclass of a view or frame layout to serve as a view container;
[0027]
monitoring a callback event, obtaining the corresponding partial virtual UI drawing cache when the view changes, and placing the partial virtual UI drawing cache in the view container; and
[0028]
generating the bitmap based on the partial virtual UI drawing cache in the view container, and drawing the bitmap on the drawing board.
[0029]
Further, when the screen resolution of the AR glasses and the resolution of the extended display are inconsistent, the method further includes:
[0030]
setting the height and the width of the view container to be consistent with the display width and the display height of the extended display;
[0031]
obtaining the actual display height and the actual display width of the view container in the AR glasses screen;
[0032]
obtaining the display height and the display width of the extended display in the drawing board;
[0033]
dividing the display height of the extended display by the actual display height of the view container in the AR glasses screen to obtain a height multiple;
[0034]
dividing the display width of the extended display by the actual display width of the view container in the AR glasses screen to obtain a width multiple; and
[0035]
performing equal ratio scaling on the drawing board according to the height multiple and the width multiple.
[0036]
Further, the performing differentiated content display on the extended display and the main screen of the AR glasses includes:
[0037]
binding the defined second interface subclass with the extended display detected through the first interface; and
[0038]
displaying the synthetic result screen in the extended display, and displaying the virtual UI on the main screen of the AR glasses at the same time.
[0039]
Further, the AR glasses are in communication connection with the extended display via a wired connection, wireless Wi-Fi, Wi-Fi Direct, Google Cast, wireless Bluetooth, GSM, CDMA, a local area network, or the Internet.
[0040]
According to a second embodiment of the present invention, a real-time picture projection apparatus of an AR glasses screen is provided, including:
[0041]
a display detection module, configured to detect whether AR glasses are connected with an external extended display;
[0042]
an image collection module configured to, when the AR glasses are connected with the external extended display, collect a camera picture through a camera disposed on the AR glasses, and collect a virtual UI through a main screen of the AR glasses;
[0043]
a picture synthesis module, configured to perform picture synthesis based on the camera picture and the virtual UI to generate a synthetic result picture, and send the synthetic result picture to the extended display; and
[0044]
a differentiated display module, configured to perform differentiated content display on the extended display and the main screen of the AR glasses.
[0045]
Further, the display detection module is specifically configured to:
[0046]
detect whether the AR glasses are connected with the external extended display through a first interface, wherein the first interface includes a display manager interface and a media router interface.
[0047]
Further, the picture synthesis module includes:
[0048]
an interface definition sub-module, configured to define a second interface subclass, wherein the second interface includes a presentation interface;
[0049]
a map layer association sub-module, configured to associate a map layer for the second interface within an initial creation cycle of the life cycle of the second interface;
[0050]
a first picture presenting sub-module, configured to present the camera picture at the bottom of the map layer of the second interface; and
[0051]
a second picture presenting sub-module, configured to synchronously present the virtual UI in the map layer of the second interface, and overlap the camera picture and the virtual UI to generate the synthetic result picture.
[0052]
Further, the second picture presenting sub-module includes:
[0053]
a drawing cache obtaining unit, configured to obtain a drawing cache of the virtual UI, wherein the drawing cache is a full-screen virtual UI drawing cache or a partial virtual UI drawing cache;
[0054]
a bitmap generation unit, configured to generate a bitmap based on the drawing cache; and
[0055]
a bitmap superposition unit, configured to create a new view extension subclass, define the new view extension subclass as a drawing board, declare that a background color drawn on the canvas is transparent, draw the bitmap in the drawing board, add the bitmap to the map layer of the second interface, and superpose the map layer of the second interface on the camera picture.
[0056]
Further, when the drawing cache is the partial virtual UI drawing cache, the second picture presenting sub-module further includes:
[0057]
a container construction sub-unit, configured to construct an extension subclass of a view or frame layout to serve as a view container;
[0058]
a container storage sub-unit, configured to monitor a callback event, obtain the corresponding partial virtual UI drawing cache when the view changes, and place the partial virtual UI drawing cache in the view container; and
[0059]
a drawing sub-unit, configured to generate the bitmap based on the partial virtual UI drawing cache in the view container, and draw the bitmap on the drawing board.
[0060]
Further, when the screen resolution of the AR glasses and the resolution of the extended display are inconsistent, the apparatus further includes:
[0061]
a parameter setting unit, configured to set the height and the width of the view container to be consistent with the display width and the display height of the extended display;
[0062]
a first parameter obtaining unit, configured to obtain the actual display height and the actual display width of the view container in the AR glasses screen;
[0063]
a second parameter obtaining unit, configured to obtain the display height and the display width of the extended display in the drawing board;
[0064]
a height multiple determining unit, configured to divide the display height of the extended display by the actual display height of the view container in the AR glasses screen to obtain a height multiple;
[0065]
a width multiple determining unit, configured to divide the display width of the extended display by the actual display width of the view container in the AR glasses screen to obtain a width multiple; and
[0066]
an equal ratio scaling unit, configured to perform equal ratio scaling on the drawing board according to the height multiple and the width multiple.
[0067]
Further, the differentiated display module includes:
[0068]
a binding unit, configured to bind the defined second interface subclass with the extended display detected through the first interface; and
[0069]
a differentiated display unit, configured to display the synthetic result screen in the extended display, and display the virtual UI on the main screen of the AR glasses at the same time.
[0070]
Further, the AR glasses are in communication connection with the extended display via a wired connection, wireless Wi-Fi, Wi-Fi Direct, Google Cast, wireless Bluetooth, GSM, CDMA, a local area network, or the Internet.
[0071]
According to a third embodiment of the present invention, a controller is provided, including a memory and a processor, wherein the memory stores a computer program which, when executed by the processor, implements the steps of the method.
[0072]
According to a fourth embodiment of the present invention, a computer-readable storage medium is provided for storing a computer program which, when executed by a computer or a processor, implements the steps of the method.
[0073]
Compared with the prior art, the present invention has obvious advantages and beneficial effects. By means of the above technical solutions, by adopting the real-time picture projection method and apparatus of the AR glasses screen, the controller and the medium provided by the present invention, considerable technical progress and practicability can be achieved, an extensive industrial use value is provided, and the present invention at least has the following advantages:
[0074]
the present invention does not need to depend on the screen mirror image via the differentiated content display of the AR glasses screen and the extended display, and the main screen of the glasses does not need to display the camera picture, so that the viewing experience of the glasses user is not reduced; the present invention does not need to set up an additional server or the like to achieve projection, thereby reducing the development cost, enabling rapid deployment, and improving the user experience.
[0075]
The above description is only an overview of the technical solutions of the present invention. In order to understand the technical means of the present invention more clearly, the present invention can be implemented in accordance with the contents of the description, and in order to make the above and other objectives, features and advantages of the present invention more comprehensible, the following specific preferred embodiments are listed and are described in detail below in conjunction with the drawings.

Brief Description of the Drawings

[0076]
Fig. 1 is a schematic diagram of a real-time picture projection scenario of an AR glasses screen provided by an embodiment of the present invention;
[0077]
Fig. 2 is a flow diagram of a real-time picture projection method based on an Android AR glasses screen provided by an embodiment of the present invention;
[0078]
Fig. 3 is a construction schematic diagram of an extended screen provided by an embodiment of the present invention;
[0079]
Fig. 4 is a schematic diagram of synchronously presenting a virtual UI on a map layer of the second Android native interface provided by an embodiment of the present invention;
[0080]
Fig. 5 is a schematic diagram of a real-time picture projection apparatus of an AR glasses screen provided by an embodiment of the present invention.
[0081]
Reference Signs
[0082]
1: Display detection module 2: Image collection module
[0083]
3: Picture synthesis module 4: Differentiated display module
[0084]
10: Real-time picture projection apparatus of AR glasses screen
[0085]
Detailed Description of the Embodiments
[0086]
In order to further illustrate the technical means and effects adopted by the present invention to achieve its predetermined purposes, specific embodiments of the real-time picture projection method and apparatus of an AR glasses screen, the controller and the medium provided according to the present invention, and the effects thereof, are described in detail below in combination with the drawings and preferred embodiments. For convenience of description, the specific embodiments of the present invention are described in detail based on an Android platform. Those skilled in the art can understand that the embodiments of the present invention can also be implemented on non-Android platforms (such as iOS, Windows or the like).
[0087]
One or more embodiments of the present invention provide a real-time picture projection method based on an Android AR glasses screen, in which contents presented on the glasses screen are projected onto an extended display by the Android AR glasses; the application scenario is shown in Fig. 1.
[0088]
The method according to the embodiment of the present invention specifically includes the following steps, as shown in Fig. 2:
[0089]
Step S1: whether the Android AR glasses are connected with an external extended display is detected.
[0090]
The Android system provides a native interface for detecting whether a mobile device is connected to an extended display; if so, related information of the extended display can be obtained in the Android system through a display manager (Display Manager) or a media router (Media Router). Therefore, as an example, the step S1 includes:
[0091]
detecting whether the Android AR glasses are connected with the external extended display through a first Android native interface, wherein the first Android native interface includes a Display Manager interface and a Media Router interface, and the first Android native interface is an example of the first interface in the claims. The Display Manager interface is used for managing multiple displays and their related attributes, and the Media Router interface allows application programs to control the routing of media channels and streams from the current device to external speakers and target devices.
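The detection in step S1 can be sketched as follows. The real Android call would query DisplayManager for presentation-category displays; here a minimal stub class stands in for the framework so the control flow can run outside Android, and all stub names are illustrative assumptions rather than the actual API.

```java
import java.util.List;

public class DisplayDetection {

    // Minimal stand-in for android.hardware.display.DisplayManager;
    // on Android one would call getDisplays(DISPLAY_CATEGORY_PRESENTATION).
    static class StubDisplayManager {
        private final List<String> presentationDisplays;
        StubDisplayManager(List<String> presentationDisplays) {
            this.presentationDisplays = presentationDisplays;
        }
        List<String> getPresentationDisplays() {
            return presentationDisplays;
        }
    }

    // Step S1: true when at least one external extended display is attached.
    static boolean isExtendedDisplayConnected(StubDisplayManager dm) {
        return !dm.getPresentationDisplays().isEmpty();
    }

    public static void main(String[] args) {
        StubDisplayManager connected = new StubDisplayManager(List.of("HDMI-1"));
        StubDisplayManager alone = new StubDisplayManager(List.of());
        System.out.println(isExtendedDisplayConnected(connected)); // true
        System.out.println(isExtendedDisplayConnected(alone));     // false
    }
}
```

On Android the same check could equally be driven by Media Router route callbacks; the stub only illustrates the branch that gates steps S2 to S4.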
[0092]
The Android AR glasses can form a communication connection with the external extended display in the following ways:
[0093]
wired: for example, a high-definition multimedia interface (High-Definition Multimedia Interface, referred to as HDMI) cable, a universal serial bus (Universal Serial Bus, referred to as USB) cable, and the like;
[0094]
wireless Wi-Fi (Wireless Fidelity): also known as a wireless hotspot;
[0095]
Wi-Fi direct (Wi-Fi Direct);
[0096]
Miracast standard: a wireless display standard based on Wi-Fi Direct. 3C (Computer, Communications, Consumer-Electronics) apparatuses that support this standard can share video pictures wirelessly; for example, a mobile phone can directly play videos or photos on a TV or other apparatus through Miracast without any connecting cable or wireless hotspot;
[0097]
Google Cast: a Google service used for projecting the pictures of application programs that support Google Cast, for example YouTube, onto an Android TV;
[0098]
wireless Bluetooth;
[0099]
Global System for Mobile Communications (GSM);
[0100]
code division multiple access (CDMA);
[0101]
a local area network; and
[0102]
the Internet.
[0103]
The Android AR glasses and the external extended display support a variety of communication connection modes, so the development difficulty is low and deployment is fast.
[0104]
Step S2: if yes, a camera picture is collected through a camera disposed on the AR glasses, and a virtual UI is collected through a main screen of the AR glasses.
[0105]
The field camera picture collected by the AR glasses camera is consistent with the real field-of-view contents of the glasses user, and the main screen of the AR glasses collects the virtual UI, that is, the virtual user interface on the glasses screen. As an example, an Android camera interface (Camera1 or Camera2) can be used to capture the camera picture and to manage and operate the related functions and attributes of the camera.
[0106]
Step S3: picture synthesis is performed based on the camera picture and the virtual UI to generate a synthetic result picture, and the synthetic result picture is sent to the extended display, as shown in Fig. 3.
[0107]
As an example, display contents can be constructed for the extended display through the Android Presentation interface. The Presentation is a special dialog box whose purpose is to display contents on an auxiliary display screen; it is associated with a target display when created, and its context and resource configuration are set according to the metrics of that display. In the step S3, the performing picture synthesis based on the camera picture and the virtual UI specifically includes:
[0108]
step S31, defining a second Android native interface subclass, wherein the second Android native interface subclass includes a Presentation interface, and the second Android native interface is an example of the second interface in the claims;
[0109]
step S32, associating a map layer with the second native interface within an initial creation cycle of the life cycle of the second Android native interface;
[0110]
step S33, presenting the camera picture at the bottom of the map layer of the second Android native interface, which is specifically implemented by calling the interface of the camera and declaring the operation authority on the camera in an application; and
[0111]
step S34, synchronously presenting the virtual UI in the map layer of the second Android native interface, and overlapping the camera picture and the virtual UI to generate the synthetic result picture.
[0112]
After the camera picture of the AR glasses is introduced into the extended display, the virtual UI on the AR glasses screen needs to be synchronously and dynamically displayed on the extended display, so that the contents viewed by the glasses user through the AR glasses can be truly presented on the extended display for others to view. This synchronous and dynamic display refers to continuously obtaining a drawing cache of the UI map layer of the main screen of the AR glasses, and re-drawing it on the screen of the extended display through a canvas (Canvas) tool. Canvas is an Android native interface which allows a developer to render custom graphics on the Canvas or modify existing views and customize their appearances. As an example, in the step S34, the synchronously presenting the virtual UI in the map layer of the second Android native interface, as shown in Fig. 4, includes:
[0113]
step S341: obtaining a drawing cache of the virtual UI, wherein the drawing cache is a full-screen virtual UI drawing cache or a partial virtual UI drawing cache. Either the partial virtual UI drawing cache or the full-screen virtual UI drawing cache of the screen can be obtained according to the development requirements. Android provides an interface for obtaining the drawing cache of a view, so the developer only needs to obtain the view cache of a specific view according to the development requirements or the view declaration, and generate the corresponding bitmap.
[0114]
Step S342: generating a bitmap based on the drawing cache; and
[0115]
step S343: creating a new view extension subclass, and defining the new view extension subclass as a drawing board, declaring that a background color drawn on the canvas is transparent, drawing the bitmap in the drawing board, adding the bitmap to the map layer of the second Android native interface, and superposing the map layer of the second Android native interface on the camera picture.
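The superposition in steps S33 and S34 can be illustrated with plain ARGB pixel values: where the virtual UI layer declares a transparent background (alpha 0), the camera pixel underneath shows through; opaque UI pixels cover it. This is a hedged sketch of the compositing principle only, using int arrays in place of Android's Bitmap and Canvas classes; the pixel values below are illustrative.

```java
public class LayerComposite {

    // Composite the UI layer over the camera picture. A fully transparent
    // UI pixel (alpha == 0) lets the camera pixel show through, matching
    // the transparent canvas background declared for the drawing board.
    static int[] composite(int[] cameraPicture, int[] uiLayer) {
        int[] result = new int[cameraPicture.length];
        for (int i = 0; i < cameraPicture.length; i++) {
            int alpha = (uiLayer[i] >>> 24) & 0xFF;
            result[i] = (alpha == 0) ? cameraPicture[i] : uiLayer[i];
        }
        return result;
    }

    public static void main(String[] args) {
        int camera   = 0xFF336699; // opaque camera-picture pixel
        int uiOpaque = 0xFFFFFFFF; // opaque white virtual-UI pixel
        int uiClear  = 0x00000000; // transparent UI background pixel
        int[] out = composite(new int[]{camera, camera},
                              new int[]{uiOpaque, uiClear});
        System.out.println(out[0] == uiOpaque); // true: UI covers camera
        System.out.println(out[1] == camera);   // true: camera shows through
    }
}
```

On Android the same effect is obtained by stacking the drawing-board view over the camera preview in the Presentation's layer, rather than by per-pixel loops.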
[0116]
In order to reduce the operation load on the device caused by the continuous acquisition of the drawing cache, a continuous action of obtaining the drawing cache is performed only when the view changes, that is, the partial virtual UI drawing cache is obtained, and when the drawing cache to be obtained is the partial virtual UI drawing cache, the step S34 further includes:
[0117]
Step S3411, an extension subclass of a view or frame layout is constructed to serve as a view container, and it can be declared in the view container that the drawing cache of the present view container is obtained.
[0118]
Step S3412, a callback event is monitored, the corresponding partial virtual UI drawing cache is obtained when the view changes, and the partial virtual UI drawing cache is placed in the view container.
[0119]
It should be noted that, in the step S341, the corresponding full-screen virtual UI drawing cache can also be obtained by monitoring the callback event when the view changes.
[0120]
Step S3413, the bitmap is generated based on the partial virtual UI drawing cache in the view container, and the bitmap is drawn on the drawing board. Through the step S3411 to the step S3413, the contents of all virtual UI drawing caches that need to be synchronized into the extended display are placed in the view container, and the view container converts and draws these virtual UIs contained therein on the corresponding drawing board. The virtual UI contents beyond the view container are not drawn in the extended display. In this way, the developer can decide which virtual UIs in the AR glasses can be drawn on the extended display and which virtual UIs do not need to be drawn on the extended display, thereby reducing the operation load on the device caused by the continuous acquisition of the drawing cache.
[0121]
It is declared in the view container that the drawing cache of the present view container is obtained, and the bitmap of the view container is drawn on the drawing board in the Presentation through the callback events monitored by a drawing monitoring (OnPreDrawListener) interface and a scrolling monitoring (OnScrollChangedListener) interface.
[0122]
OnPreDrawListener defines the callback to be invoked when a view tree is about to be drawn, and OnScrollChangedListener defines the callback to be invoked when some contents in the view tree are scrolled. Implementing these two monitored callback events ensures that the action of obtaining the drawing cache is triggered only when a view changes, thereby saving device resources. A bitmap is generated from the obtained drawing cache, and a listener for monitoring the generation of a new bitmap is defined; whenever a new bitmap is generated, a callback event is triggered. This listener is declared in the drawing board, and its callback draws the newly generated bitmap on the current drawing board with the background color of the canvas declared transparent. The drawing board is added to the bound map layer of the Presentation and is superimposed on the camera preview picture, so that a superimposed picture including the camera picture of the AR glasses and the UI picture of the main screen of the AR glasses is presented on the extended display at the same time.
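The change-driven refresh described above can be sketched as a listener pattern: the drawing cache is rebuilt only when a change callback fires, as OnPreDrawListener and OnScrollChangedListener would fire on Android, rather than on every frame. The ViewContainer and ChangeListener types below are illustrative stand-ins for the Android view-tree machinery, not real framework classes.

```java
import java.util.ArrayList;
import java.util.List;

public class CacheOnChange {

    // Stand-in for the pre-draw / scroll-changed callbacks.
    interface ChangeListener { void onViewChanged(); }

    // Stand-in for the view container holding the partial virtual UI.
    static class ViewContainer {
        private final List<ChangeListener> listeners = new ArrayList<>();
        int cacheRefreshCount = 0;

        void addChangeListener(ChangeListener l) { listeners.add(l); }

        // Invoked by the platform whenever any child view changes.
        void notifyChanged() {
            for (ChangeListener l : listeners) l.onViewChanged();
        }

        // Rebuild the drawing cache (counted here for illustration).
        void refreshDrawingCache() { cacheRefreshCount++; }
    }

    public static void main(String[] args) {
        ViewContainer container = new ViewContainer();
        // Register the cache refresh as the change callback.
        container.addChangeListener(container::refreshDrawingCache);

        System.out.println(container.cacheRefreshCount); // 0: no change yet
        container.notifyChanged();
        container.notifyChanged();
        System.out.println(container.cacheRefreshCount); // 2: one per change
    }
}
```

Because no work happens between callbacks, a static virtual UI costs nothing per frame, which is the resource saving the paragraph above describes.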
[0123]
Step S4: differentiated content display is performed on the extended display and the main screen of the AR glasses.
[0124]
As an example, the step S4 includes:
[0125]
step S41: binding the defined second Android native interface subclass with the extended display detected through the first Android native interface; and
[0126]
step S42: displaying the synthetic result screen in the extended display, and displaying the virtual UI on the main screen of the AR glasses at the same time, thereby realizing the differentiated content display of the extended display and the main screen of the AR glasses.
[0127]
It should be noted that the user may use an extended display whose resolution is not compatible with that of the AR glasses as the output of the projection. If the UI is simply drawn according to the resolution of the extended display, deformation will occur, so it is necessary to set the relevant sizes with the resolution of the glasses or of the extended display as the reference, or to perform equal ratio scaling as needed to achieve a better output effect. As an example, when the screen resolution of the AR glasses and the resolution of the extended display are inconsistent, the method further includes:
[0128]
step S51: setting the height and the width of the view container to be consistent with the display height and the display width of the extended display;
[0129]
step S52: obtaining the actual display height and the actual display width of the view container in the AR glasses screen. As an example, it can be declared in the measurement function onMeasure() that the actual display height and the actual display width of the view container in the AR glasses are returned within the measurement cycle. A class called the display size assistant is created; a height and a width of the integer type are defined in this size record class, two getter methods and two setter methods are established for obtaining and setting the height and the width respectively, and the setter methods of the display size assistant are called to store the actual display height and the actual display width of the view container in the AR glasses;
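The "display size assistant" of step S52 can be sketched as a minimal holder class; the class and method names below are illustrative assumptions rather than names fixed by the specification.

```java
// Minimal sketch of the "display size assistant" record class described in
// step S52: an integer height and width with getter and setter methods,
// used to store the view container's actual on-screen size in the glasses.
public class DisplaySizeAssistant {
    private int height;
    private int width;

    public int getHeight() { return height; }
    public int getWidth() { return width; }
    public void setHeight(int height) { this.height = height; }
    public void setWidth(int width) { this.width = width; }
}
```

The setters would be called with the values measured in onMeasure(), and the getters would later feed the scaling computation of steps S54-S56.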
[0130]
step S53: obtaining the display height and the display width of the extended display in the drawing board. As an example, the display height and the display width of the extended display can be obtained via the display metrics (DisplayMetrics) interface in the drawing board; the DisplayMetrics interface is an Android native interface and is a structure describing the general information of a display, for example, its size, density and font scaling;
[0131]
step S54: dividing the display height of the extended display by the actual display height of the view container in the AR glasses screen to obtain a height multiple;
[0132]
step S55: dividing the display width of the extended display by the actual display width of the view container in the AR glasses screen to obtain a width multiple; and
[0133]
step S56: performing equal ratio scaling on the drawing board according to the height multiple and the width multiple. Specifically, the equal ratio scaling can be performed on the drawing board through the horizontal scaling ratio (ScaleX) interface and the vertical scaling ratio (ScaleY) interface of the drawing board. In this way, the bitmap transmitted from the AR glasses is presented in the extended display in equal ratio scaled form when redrawn, thereby realizing a good viewing experience on the extended display.
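The arithmetic of steps S54-S56 can be sketched as follows. This is a plain-Java illustration assuming each multiple divides like dimension by like dimension (extended-display size over the view container's actual size in the glasses screen); the resulting factors are what would be handed to the drawing board's ScaleX/ScaleY interfaces.

```java
// Sketch of the scale-factor computation in steps S54-S56: the extended
// display's dimensions divided by the view container's actual dimensions
// in the AR glasses screen give the height and width multiples used for
// equal ratio scaling of the drawing board.
public class EqualRatioScaling {
    // Step S54: height multiple = extended display height / actual container height.
    public static float heightMultiple(int displayHeight, int actualHeight) {
        return (float) displayHeight / actualHeight;
    }

    // Step S55: width multiple = extended display width / actual container width.
    public static float widthMultiple(int displayWidth, int actualWidth) {
        return (float) displayWidth / actualWidth;
    }
}
```

For example, a 1080-pixel-high extended display and a 540-pixel-high container yield a height multiple of 2.0, so the redrawn bitmap is doubled vertically on the drawing board.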
[0134]
In the embodiment of the present invention, a two-dimensional virtual UI picture of the Android main screen can be synchronized to the extended display according to the Android UI architecture on the Android platform installed on the Android AR glasses. Developers can also synchronize a three-dimensional virtual UI picture to the extended display through this architecture: for example, a graphics language surface view (GLSurfaceView, an Android native view component that can present complex three-dimensional image objects) is added to the AR glasses and the extended display respectively, the two GLSurfaceView views render the same image object at the same time, and the rendering of the two GLSurfaceViews is kept synchronized, thereby further enhancing the implementation effect of the present solution.
[0135]
The method described in the embodiment of the present invention does not rely on the Android screen mirror image, but adopts a differentiated content display mode for the extended display and the main screen of the AR glasses; the main screen of the glasses does not display the browsing picture of the camera, so the experience of the glasses user while wearing the glasses is not degraded. In addition, the embodiment of the present invention does not rely on any background multimedia stream service, and can support multimedia of the relevant protocols through Wi-Fi direct, for example, smart TVs or set-top boxes supporting Google Cast and Miracast; it also supports a wired link, for example, connecting a high-definition multimedia interface cable and projecting the picture of the AR glasses to the extended display using the OTG protocol. USB On-The-Go, often abbreviated as USB OTG, is a supplementary standard to the USB 2.0 specification. It allows a USB device, such as a player or a mobile phone, to change from a USB peripheral into a USB host and communicate with other USB devices. Under normal circumstances, these OTG-capable USB devices and USB hosts, such as desktop or laptop computers, are still used as USB peripherals. The embodiment of the present invention also supports the screen projection of a multimedia stream service from the cloud, and picture projection can also be performed on designated display devices in a local area network through a back-end multimedia stream service in the local area network, so that the application range is wide.
[0136]
According to a second embodiment of the present invention, a real-time picture projection apparatus 10 based on an Android AR glasses screen is provided, as shown in Fig. 5, including a display detection module 1, an image collection module 2, a picture synthesis module 3 and a differentiated display module 4, wherein the display detection module 1 is configured to detect whether Android AR glasses are connected with an external extended display; the image collection module 2 is configured to, when the AR glasses are connected with the external extended display, collect a camera picture through a camera disposed on the AR glasses, and collect a virtual UI through a main screen of the AR glasses; the picture synthesis module 3 is configured to perform picture synthesis based on the camera picture and the virtual UI to generate a synthetic result picture, and send the synthetic result picture to the extended display; and the differentiated display module 4 is configured to perform differentiated content display on the extended display and the main screen of the AR glasses.
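The four-module apparatus above can be modelled, for illustration only, as a small plain-Java pipeline. The interface and method names below are assumptions invented for this sketch; the real apparatus is built on Android components, and the `String` payloads stand in for camera frames, UI map layers and synthesized pictures.

```java
// Illustrative modelling of the apparatus of the second embodiment (Fig. 5):
// one interface per module, wired in the order the apparatus describes.
public class ProjectionPipeline {
    interface DisplayDetectionModule { boolean extendedDisplayConnected(); }
    interface ImageCollectionModule { String collectCameraPicture(); String collectVirtualUi(); }
    interface PictureSynthesisModule { String synthesize(String camera, String ui); }
    interface DifferentiatedDisplayModule { void show(String extendedContent, String glassesContent); }

    // Runs one projection pass; returns the synthetic picture sent to the
    // extended display, or null when no extended display is connected.
    static String runOnce(DisplayDetectionModule d, ImageCollectionModule i,
                          PictureSynthesisModule s, DifferentiatedDisplayModule out) {
        if (!d.extendedDisplayConnected()) return null;
        String camera = i.collectCameraPicture();
        String ui = i.collectVirtualUi();
        String synthetic = s.synthesize(camera, ui);
        // Differentiated display: the extended display receives the
        // synthesis, while the glasses main screen keeps only the virtual UI.
        out.show(synthetic, ui);
        return synthetic;
    }
}
```

The sketch makes the differentiation explicit: the synthesized camera-plus-UI picture goes only to the extended display, and the glasses screen continues to show the virtual UI alone.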
[0137]
The Android AR glasses can form a communication connection with the external extended display in the following ways:
[0138]
wired: for example, a high-definition multimedia interface (High-Definition Multimedia Interface, referred to as HDMI) cable, a universal serial bus (Universal Serial Bus, referred to as USB) cable, and the like;
[0139]
wireless Wi-Fi (Wireless Fidelity), also known as a wireless hotspot;
[0140]
Wi-Fi direct (Wi-Fi Direct) ;
[0141]
Miracast standard: a wireless display standard based on the Wi-Fi direct, 3C (Computer, Communications, Consumer-Electronics) apparatuses that support this standard can share video pictures in a wireless manner, for example, a mobile phone can directly play videos or photos on a TV or other apparatuses through the Miracast without using any connecting lines or the wireless hotspot.
[0142]
Google Cast: a Google service, which is used for projecting the pictures of application programs that support the Google Cast, for example, YouTube (a video website), onto an Android TV;
[0143]
wireless Bluetooth;
[0144]
global system for mobile communications (Global System for Mobile Communications, referred to as GSM);
[0145]
code division multiple access (Code Division Multiple Access, referred to as CDMA);
[0146]
a local area network; and
[0147]
the Internet.
[0148]
The Android AR glasses and the external extended display support a variety of communication connection modes, so that the development difficulty is low, and the deployment is fast.
[0149]
As an example, the display detection module 1 is specifically configured to: detect whether the Android AR glasses are connected with the external extended display through a first Android native interface, wherein the first Android native interface includes a display manager interface and a media router interface.
[0150]
As an example, display contents can be constructed for the extended display through the Android Presentation. The picture synthesis module 3 includes an interface definition sub-module, a map layer association sub-module, a first picture presenting sub-module and a second picture presenting sub-module, wherein the interface definition sub-module is configured to define a second Android native interface subclass, wherein the second Android native interface includes a presentation interface; the map layer association sub-module is configured to associate a map layer with the second Android native interface within an initial creation cycle of the life cycle of the second Android native interface; the first picture presenting sub-module is configured to present the camera picture at the bottom of the map layer of the second Android native interface; and the second picture presenting sub-module is configured to synchronously present the virtual UI in the map layer of the second Android native interface, and to overlap the camera picture and the virtual UI to generate the synthetic result picture.
[0151]
After the camera picture of the AR glasses is introduced into the extended display, the virtual UI in the AR glasses screen needs to be synchronously and dynamically displayed in the extended display, so that the contents viewed by the glasses user through the AR glasses can be truly presented in the extended display for others to view. The synchronous and dynamic display of the virtual UI of the AR glasses screen in the extended display refers to an action of continuously obtaining a drawing cache of the UI map layer of the main screen of the AR glasses, and a process of performing re-drawing in the screen of the extended display through the Canvas tool. Canvas is an Android native interface, which allows the developer to render custom graphics on the canvas or to modify existing views and customize their appearance. As an example, the second picture presenting sub-module includes a drawing cache obtaining unit, a bitmap generation unit and a bitmap superposition unit. The drawing cache obtaining unit is configured to obtain a drawing cache of the virtual UI, where the drawing cache is a full-screen virtual UI drawing cache or a partial virtual UI drawing cache; the partial virtual UI drawing cache or the full-screen virtual UI drawing cache can be obtained from the screen according to the development requirements. Android provides an interface for obtaining the drawing cache of a view, and the developer only needs to obtain the view cache of a specific view according to the development requirements or according to the view declaration, and generate the corresponding bitmap.
The bitmap generation unit is configured to generate a bitmap based on the drawing cache; and the bitmap superposition unit is configured to create a new view extension subclass, define the new view extension subclass as a drawing board, declare that the background color drawn on the canvas is transparent, draw the bitmap in the drawing board, add the bitmap to the map layer of the second interface, and superpose the map layer of the second interface on the camera picture.
[0152]
In order to reduce the operation load on the device caused by the continuous acquisition of the drawing cache, the continuous action of obtaining the drawing cache is performed only when a view changes, that is, the partial virtual UI drawing cache is obtained. When the drawing cache to be obtained is the partial virtual UI drawing cache, the second picture presenting sub-module further includes a container construction sub-unit, a container storage sub-unit and a drawing sub-unit, wherein the container construction sub-unit is configured to construct an extension subclass of a view or frame layout to serve as a view container; the container storage sub-unit is configured to monitor a callback event, obtain the corresponding partial virtual UI drawing cache when the view changes, and place the partial virtual UI drawing cache in the view container; and the drawing sub-unit is configured to generate the bitmap based on the partial virtual UI drawing cache in the view container, and draw the bitmap on the drawing board.
[0153]
When the drawing cache is the partial virtual UI drawing cache, the drawing cache obtaining unit places the contents of all the virtual UI drawing caches that need to be synchronized to the extended display in the view container, and the view container converts and draws these contained virtual UIs on the corresponding drawing board. The virtual UI contents outside the view container are not drawn in the extended display. In this way, the developer can decide which virtual UIs in the AR glasses are drawn on the extended display and which need not be, thereby reducing the operation load on the device caused by the continuous acquisition of the drawing cache.
[0154]
It is declared in the view container that the drawing cache of the present view container is obtained, and the bitmap of the view container is drawn on the drawing board in the Presentation through the callback events monitored by a drawing monitoring (OnPreDrawListener) interface and a scrolling monitoring (OnScrollChangedListener) interface.
[0155]
OnPreDrawListener represents the interface definition of the callback to be called when a view tree is about to be drawn, and OnScrollChangedListener represents the interface definition of the callback to be called when some contents in the view tree are scrolled. These two monitored callback events can be implemented to ensure that the action of obtaining the drawing cache is triggered only when a view changes, thereby saving device resources. The bitmap is generated from the obtained drawing cache, and a listener for monitoring the generation of a new bitmap is defined; whenever a new bitmap is generated, a callback event is triggered. The listener for the new bitmap is declared in the drawing board, and its callback event draws the newly generated bitmap on the current drawing board and declares the background color of the canvas drawing to be transparent. The drawing board is added to a bound map layer of the Presentation and is superimposed on the camera browsing picture, so that a superimposed picture including the camera picture of the AR glasses and the UI picture of the main screen of the AR glasses is presented in the extended display at the same time.
[0156]
As an example, the differentiated display module 4 includes a binding unit and a differentiated display unit, wherein the binding unit is configured to bind the defined second Android native interface subclass with the extended display detected through the first Android native interface; and the differentiated display unit is configured to display the synthetic result picture in the extended display, and display the virtual UI on the main screen of the AR glasses at the same time, thereby realizing the differentiated content display of the extended display and the main screen of the AR glasses.
[0157]
It should be noted that the user may use an extended display whose resolution does not match that of the AR glasses as the output of the projection picture. If the UI is simply drawn according to the resolution of the extended display, deformation will occur, so it is necessary to set the relevant sizes with the resolution of the glasses or the resolution of the extended display as the reference, or to perform equal ratio scaling as needed, to achieve a better output effect. As an example, when the screen resolution of the AR glasses and the resolution of the extended display are inconsistent, the apparatus further includes a parameter setting unit, a first parameter obtaining unit, a second parameter obtaining unit, a height multiple determining unit, a width multiple determining unit and an equal ratio scaling unit, wherein the parameter setting unit is configured to create a view container, and set the height and the width of the view container to be consistent with the display height and the display width of the extended display; the first parameter obtaining unit is configured to obtain the actual display height and the actual display width of the view container on the AR glasses screen; the second parameter obtaining unit is configured to obtain the display height and the display width of the extended display in the drawing board; the height multiple determining unit is configured to divide the display height of the extended display by the actual display height of the view container on the AR glasses screen to obtain a height multiple; the width multiple determining unit is configured to divide the display width of the extended display by the actual display width of the view container on the AR glasses screen to obtain a width multiple; and the equal ratio scaling unit is configured to perform equal ratio scaling on the drawing board according to the height multiple and the width multiple.
[0158]
The embodiment of the present invention further provides a controller, including a memory and a processor, wherein the memory stores a computer program, and the program, when executed by the processor, implements the steps of the above real-time picture projection method of the Android AR glasses screen.
[0159]
The embodiment of the present invention further provides a computer-readable storage medium for storing a computer program, wherein the program, when executed by a computer or a processor, implements the steps of the above real-time picture projection method of the Android AR glasses screen.
[0160]
In the embodiment of the present invention, the differentiated content display of the AR glasses screen and the extended display does not need to depend on the screen mirror image, and the main screen of the glasses does not need to display the camera picture, so that the viewing experience of the glasses user is not reduced; and the present invention does not need to set up an additional server or the like to achieve projection, thereby reducing the development cost, enabling rapid deployment, and improving the user experience.
[0161]
Although the embodiments of the present invention are described based on the Android system, those skilled in the art can understand that one or more embodiments of the present invention can be implemented on an operating system using the same or similar functions as the aforementioned interfaces.
[0162]
In addition, although the embodiments of the present invention are described based on the AR glasses, the present invention is not limited thereto. Those skilled in the art can understand that the picture projection of any product that superimposes a virtual user interface on the real field of view, such as VR glasses and AR helmets, is equally applicable to the technical solutions of the present invention.
[0163]
The above descriptions are only the preferred embodiments of the present invention, and do not limit the present invention in any form. Although the present invention has been disclosed above by the preferred embodiments, the present invention is not limited thereto. Anyone skilled in the art can make a few changes or modifications by using the technical contents disclosed above to serve as equivalent embodiments of equivalent changes, without departing from the scope of the technical solutions of the present invention. Any simple modifications, equivalent changes and modifications made to the above embodiments according to the technical essence of the present invention, without departing from the contents of the technical solutions of the present invention, still fall within the scope of the technical solutions of the present invention.

Claims

[Claim 1]
A real-time picture projection method of an AR glasses screen, comprising: detecting whether AR glasses are connected with an external extended display; if yes, collecting a camera picture through a camera disposed on the AR glasses, and collecting a virtual UI through a main screen of the AR glasses; performing picture synthesis based on the camera picture and the virtual UI to generate a synthetic result picture, and sending the synthetic result picture to the extended display; and performing differentiated content display on the extended display and the main screen of the AR glasses.
[Claim 2]
The real-time picture projection method of the AR glasses screen according to claim 1, wherein, the detecting whether AR glasses are connected with an external extended display comprises: detecting whether the AR glasses are connected with the external extended display through a first interface, wherein the first interface includes a display manager interface and a media router interface.
[Claim 3]
The real-time picture projection method of the AR glasses screen according to claim 2, wherein, the performing picture synthesis based on the camera picture and the virtual UI comprises: defining a second interface subclass, wherein the second interface includes a presentation interface; associating a map layer with the second interface within an initial creation cycle of the life cycle of the second interface; presenting the camera picture at the bottom of the map layer of the second interface; and synchronously presenting the virtual UI in the map layer of the second interface, and overlapping the camera picture and the virtual UI to generate the synthetic result picture.
[Claim 4]
The real-time picture projection method of the AR glasses screen according to claim 3, wherein, the synchronously presenting the virtual UI in the map layer of the second interface comprises: obtaining a drawing cache of the virtual UI, wherein the drawing cache is a full-screen virtual UI drawing cache or a partial virtual UI drawing cache; generating a bitmap based on the drawing cache; and creating a new view extension subclass, defining the new view extension subclass as a drawing board, declaring that a background color drawn on the canvas is transparent, drawing the bitmap in the drawing board, adding the bitmap to the map layer of the second interface, and superposing the map layer of the second interface on the camera picture.
[Claim 5]
The real-time picture projection method of the AR glasses screen according to claim 4, wherein, when the drawing cache to be obtained is the partial virtual UI drawing cache, constructing an extension subclass of a view or frame layout to serve as a view container; monitoring a callback event, obtaining the corresponding partial virtual UI drawing cache when the view changes, and placing the partial virtual UI drawing cache in the view container; and generating the bitmap based on the partial virtual UI drawing cache in the view container, and drawing the bitmap on the drawing board.
[Claim 6]
The real-time picture projection method of the AR glasses screen according to claim 5, wherein, when the screen resolution of the AR glasses and the resolution of the extended display are inconsistent, the method further comprises: setting the height and the width of the view container to be consistent with the display height and the display width of the extended display; obtaining the actual display height and the actual display width of the view container in the AR glasses screen; obtaining the display height and the display width of the extended display in the drawing board; dividing the display height of the extended display by the actual display height of the view container in the AR glasses screen to obtain a height multiple; dividing the display width of the extended display by the actual display width of the view container in the AR glasses screen to obtain a width multiple; and performing equal ratio scaling on the drawing board according to the height multiple and the width multiple.
[Claim 7]
The real-time picture projection method of the AR glasses screen according to claim 3, wherein, the performing differentiated content display on the extended display and the main screen of the AR glasses comprises: binding the defined second interface subclass with the extended display detected through the first interface; and displaying the synthetic result picture in the extended display, and displaying the virtual UI on the main screen of the AR glasses at the same time.
[Claim 8]
The real-time picture projection method of the AR glasses screen according to any of claims 1-7, wherein, the AR glasses are in communication connection with the extended display via wired, wireless Wi-Fi, Wi-Fi direct, Google cast, wireless Bluetooth, GSM, CDMA, a local area network, or the Internet.
[Claim 9]
A real-time picture projection apparatus of an AR glasses screen, comprising: a display detection module, configured to detect whether AR glasses are connected with an external extended display; an image collection module configured to, when the AR glasses are connected with the external extended display, collect a camera picture through a camera disposed on the AR glasses, and collect a virtual UI through a main screen of the AR glasses; a picture synthesis module, configured to perform picture synthesis based on the camera picture and the virtual UI to generate a synthetic result picture, and send the synthetic result picture to the extended display; and a differentiated display module, configured to perform differentiated content display on the extended display and the main screen of the AR glasses.
[Claim 10]
The real-time picture projection apparatus of the AR glasses screen according to claim 9, wherein, the display detection module is specifically configured to: detect whether the AR glasses are connected with the external extended display through a first interface, wherein the first interface includes a display manager interface and a media router interface.
[Claim 11]
The real-time picture projection apparatus of the AR glasses screen according to claim 10, wherein, the picture synthesis module comprises: an interface definition sub-module, configured to define a second interface subclass, wherein the second interface includes a presentation interface; a map layer association sub-module, configured to associate a map layer with the second interface within an initial creation cycle of the life cycle of the second interface; a first picture presenting sub-module, configured to present the camera picture at the bottom of the map layer of the second interface; and a second picture presenting sub-module, configured to synchronously present the virtual UI in the map layer of the second interface, and overlapping the camera picture and the virtual UI to generate the synthetic result picture.
[Claim 12]
The real-time picture projection apparatus of the AR glasses screen according to claim 11, wherein, the second picture presenting sub-module comprises: a drawing cache obtaining unit, configured to obtain a drawing cache of the virtual UI, wherein the drawing cache is a full-screen virtual UI drawing cache or a partial virtual UI drawing cache; a bitmap generation unit, configured to generate a bitmap based on the drawing cache; and a bitmap superposition unit, configured to create a new view extension subclass, define the new view extension subclass as a drawing board, declare that a background color drawn on the canvas is transparent, draw the bitmap in the drawing board, add the bitmap to the map layer of the second interface, and superpose the map layer of the second interface on the camera picture.
[Claim 13]
The real-time picture projection apparatus of the AR glasses screen according to claim 12, wherein, when the drawing cache is the partial virtual UI drawing cache, the second picture presenting sub-module further comprises: a container construction sub-unit, configured to construct an extension subclass of a view or frame layout to serve as a view container; a container storage sub-unit, configured to monitor a callback event, obtain the corresponding partial virtual UI drawing cache when the view changes, and place the partial virtual UI drawing cache in the view container; and a drawing sub-unit, configured to generate the bitmap based on the partial virtual UI drawing cache in the view container, and draw the bitmap on the drawing board.
[Claim 14]
The real-time picture projection apparatus of the AR glasses screen according to claim 13, wherein, when the screen resolution of the AR glasses and the resolution of the extended display are inconsistent, the apparatus further comprises: a parameter setting unit, configured to set the height and the width of the view container to be consistent with the display height and the display width of the extended display; a first parameter obtaining unit, configured to obtain the actual display height and the actual display width of the view container in the AR glasses screen; a second parameter obtaining unit, configured to obtain the display height and the display width of the extended display in the drawing board; a height multiple determining unit, configured to divide the display height of the extended display by the actual display height of the view container in the AR glasses screen to obtain a height multiple; a width multiple determining unit, configured to divide the display width of the extended display by the actual display width of the view container in the AR glasses screen to obtain a width multiple; and an equal ratio scaling unit, configured to perform equal ratio scaling on the drawing board according to the height multiple and the width multiple.
[Claim 15]
The real-time picture projection apparatus of the AR glasses screen according to claim 11, wherein, the differentiated display module comprises: a binding unit, configured to bind the defined second interface subclass with the extended display detected through the first interface; and a differentiated display unit, configured to display the synthetic result picture in the extended display, and display the virtual UI on the main screen of the AR glasses at the same time.
[Claim 16]
The real-time picture projection apparatus of the AR glasses screen according to any of claims 9-15, wherein, the AR glasses are in communication connection with the extended display via wired, wireless Wi-Fi, Wi-Fi direct, Google cast, wireless Bluetooth, GSM, CDMA, a local area network, or the Internet.
[Claim 17]
A controller, comprising a memory and a processor, wherein the memory stores computer programs, and the programs implement the steps of the method according to any of claims 1-8 when executed by the processor.
[Claim 18]
A computer-readable storage medium, for storing computer programs, wherein the programs implement the steps of the method according to any of claims 1-8 when executed by a computer or a processor.

Drawings

[ Fig. 1]  
[ Fig. 2]  
[ Fig. 3]  
[ Fig. 4]  
[ Fig. 5]