
CN112950791B - Display method and related device - Google Patents

Display method and related device

Info

Publication number
CN112950791B
CN112950791B (application CN202110377394.2A)
Authority
CN
China
Prior art keywords
layer
projection
area
virtual object
electrode layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110377394.2A
Other languages
Chinese (zh)
Other versions
CN112950791A (en)
Inventor
林明田
周伟彪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority claimed from application CN202110377394.2A
Publication of CN112950791A
Application granted
Publication of CN112950791B
Legal status: Active


Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • G06T 19/006: Mixed reality
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00: Machine learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Devices For Indicating Variable Information By Combining Individual Elements (AREA)

Abstract


The embodiment of the present application discloses a display method and related devices, which at least relate to machine learning in artificial intelligence. The display component of the AR device includes a projection layer and a shielding layer that overlap in the display direction, with the shielding layer behind the projection layer. The AR device projects the virtual object onto the projection layer. To prevent ambient light from overlapping with the light and shadow on the virtual object, which would make its visual effect hard to match the real environment, the projection area of the virtual object on the projection layer is determined, and the first area corresponding to the projection area is located in the shielding layer according to the position correspondence between the two layers. When the augmented reality device projects the virtual object, the state of the first area is switched from a light-transmitting state to a shielding state, while the second area, the rest of the shielding layer outside the first area, remains light-transmitting. The display effect and realism of the virtual object are thus improved, and the user's experience enhanced, without affecting the user's view of objects in the real scene.

Description

Display method and related device
Technical Field
The application relates to the field of augmented reality, in particular to a display method and a related device.
Background
Augmented Reality (AR) technology is a technology that seamlessly merges virtual information with the real world. Through an AR device, a user can observe virtual objects projected into the real scene, such as virtual animated figures and virtual indicators. From the user's perspective, a virtual object projected by AR technology should appear as if it actually existed in the real scene, bringing a brand-new visual and interactive experience.
However, the virtual object is in fact a projection on the display interface of the AR device. When the user observes through the AR device, the virtual object overlaps with the light and shadow of the real scene, so the virtual object the user sees does not fuse well with the real scene and lacks realism.
Disclosure of Invention
In order to solve the above technical problems, the present application provides a display method and a related device, which improve the fusion between a real scene and a virtual object and enhance the sense of reality when a user observes through an AR device.
The embodiment of the application discloses the following technical scheme:
In one aspect, the present application provides a display method, the method comprising:
determining a projection area of a virtual object on a projection layer, wherein the projection layer is arranged in a display component of the augmented reality device, the display component further comprises a shielding layer overlapping the projection layer in a display direction, and the shielding layer is positioned behind the projection layer in the display direction;
determining a first area of the shielding layer corresponding to the projection area according to the position correspondence between the projection layer and the shielding layer; and
switching the state of the first area from a light-transmitting state to a shielding state while keeping a second area, the part of the shielding layer other than the first area, in the light-transmitting state, wherein an area of the shielding layer in the light-transmitting state does not block the transmission of ambient light, and an area of the shielding layer in the shielding state blocks the transmission of ambient light.
In another aspect, the present application provides an augmented reality device, the device including a display assembly, a projection assembly, and a processing assembly, the display assembly including a projection layer and a mask layer disposed overlapping one another in a display direction, the mask layer being positioned behind the projection layer in the display direction;
The projection component is used for projecting a virtual object on the projection layer;
The processing component is configured to, upon determining the projection area of the virtual object on the projection layer, determine a first area of the shielding layer corresponding to the projection area according to the position correspondence between the projection layer and the shielding layer, switch the state of the first area from a light-transmitting state to a shielding state, and keep the second area, the part of the shielding layer other than the first area, in the light-transmitting state. An area of the shielding layer in the light-transmitting state does not block the transmission of ambient light, while an area in the shielding state blocks it.
In another aspect, the application provides a computer device comprising a processor and a memory:
The memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to perform the method of the above aspect according to instructions in the program code.
In another aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program for executing the method described in the above aspect.
In another aspect, embodiments of the present application provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the method described in the above aspect.
As can be seen from the above technical solution, the display assembly of the AR device includes a projection layer and a shielding layer that overlap in the display direction, with the shielding layer behind the projection layer. The AR device projects the virtual object onto the projection layer to create the visual effect of the virtual object existing in the real scene. To prevent ambient light from overlapping with the light and shadow on the virtual object, which would make its visual effect hard to match the real environment, the projection area of the virtual object on the projection layer is determined, and the first area corresponding to the projection area is located in the shielding layer according to the position correspondence between the two layers. When the augmented reality device projects the virtual object, the state of the first area is switched from the light-transmitting state to the shielding state, while the second area of the shielding layer, outside the first area, remains light-transmitting. Because the shielding layer is farther from the user than the projection layer, the first area, once switched to the shielding state, prevents ambient light from passing through behind the projection area without affecting the normal display of the virtual object. The virtual object therefore appears more realistic to the user and is less prone to the abnormal transparency and distortion caused by superimposed ambient light from the real scene acting as a background.
Meanwhile, the second area of the shielding layer remains light-transmitting while the virtual object is projected and does not block ambient light, so the user can still observe the real environment around the virtual object and obtain the AR visual experience of a virtual object situated in the real environment.
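As a rough illustration only, the three-step flow summarized above (determine the projection area, map it to the first area, switch states) can be sketched in Python. The discrete cell grid, the state constants, and the 1:1 position correspondence are assumptions made for the sketch, not details taken from the patent:

```python
# Hypothetical sketch of the claimed flow on a discrete cell grid.
# TRANSMIT/SHIELD and the identity layer correspondence are illustrative
# assumptions; the real layers are continuous optical components.
TRANSMIT, SHIELD = 0, 1

def apply_shielding(projection_area, shielding_layer):
    """projection_area: set of (row, col) cells the virtual object covers
    on the projection layer; shielding_layer: 2-D list of cell states."""
    # Map the projection area to the first area of the shielding layer
    # (a trivial identity correspondence is assumed here).
    first_area = set(projection_area)
    # Switch the first area to the shielding state; every other cell
    # (the second area) stays light-transmitting.
    for r, row in enumerate(shielding_layer):
        for c in range(len(row)):
            row[c] = SHIELD if (r, c) in first_area else TRANSMIT
    return shielding_layer
```

Cells in the first area then block ambient light behind the projected object, while the second area leaves the real scene visible, matching the behavior described above.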
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the application, and a person skilled in the art may derive other drawings from them without inventive effort.
Fig. 1 is a schematic view of an application scenario of a display method according to an embodiment of the present application;
fig. 2 is a schematic diagram of an AR device according to an embodiment of the present application;
Fig. 3 is a schematic diagram of AR glasses according to an embodiment of the present application;
FIG. 4 is a flowchart of a display method according to an embodiment of the present application;
FIG. 5 is a display link diagram of an AR device displaying a virtual object;
FIG. 6 is a schematic diagram of a user observing a virtual object using an AR device in the related art;
FIG. 7 is a display link diagram of an AR device displaying a virtual object;
FIG. 8a is a schematic diagram of a prism-based optical display architecture;
FIG. 8b is a schematic diagram of a birdbath optical display architecture;
FIG. 8c is a schematic diagram of a free-form surface optical display architecture;
FIG. 8d is a schematic diagram of an off-axis holographic lens optical display architecture;
FIG. 8e is a schematic diagram of an optical waveguide display architecture;
fig. 9 is a schematic diagram of a display link of an AR device displaying a virtual object according to an embodiment of the present application;
FIG. 10 is a schematic diagram of an electrochromic layer according to an embodiment of the present application;
FIG. 11 is a schematic diagram of an operation principle of an electro-variable grating module according to an embodiment of the present application;
FIG. 12 is a schematic diagram of an operation principle of an electro-variable grating module according to an embodiment of the present application;
FIG. 13 is a schematic diagram of an operation principle of an electro-variable grating module according to an embodiment of the present application;
FIG. 14 is an equivalent circuit diagram of an electro-variable grating module corresponding to a display pixel according to an embodiment of the present application;
FIG. 15 is an equivalent circuit diagram of an electro-variable grating module corresponding to a display pixel according to an embodiment of the present application;
FIG. 16 is a schematic view of an electrochromic layer according to an embodiment of the present application;
fig. 17 is a schematic structural diagram of a server according to an embodiment of the present application;
fig. 18 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described below with reference to the accompanying drawings.
With the development of science and technology, AR devices are becoming more popular and are gradually entering people's work and daily life. Compared with Virtual Reality (VR) devices, however, AR devices have poorer display effects, for example in the contrast and saturation of the colors of virtual objects. Meanwhile, in actual use, color cast unavoidably appears in the display of virtual objects because usage scenarios are varied and complex, so the virtual object fuses poorly with the real scene, lacks realism, and degrades the user's experience.
Based on the above, the embodiment of the application provides a display method and a related device, which are used for improving the sense of reality of a virtual object in a real scene and improving the use experience of a user.
The display method provided by the embodiment of the application is implemented based on artificial intelligence. Artificial Intelligence (AI) is a theory, method, technology, and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend, and expand human intelligence, perceive the environment, acquire knowledge, and use that knowledge to obtain optimal results. In other words, artificial intelligence is a comprehensive technology of computer science that attempts to understand the essence of intelligence and produce a new kind of intelligent machine that can react in a way similar to human intelligence. Artificial intelligence is thus the study of the design principles and implementation methods of various intelligent machines, enabling machines to perceive, reason, and make decisions.
Artificial intelligence technology is a comprehensive discipline that spans a wide range of fields, covering both hardware-level and software-level technologies. Basic artificial intelligence technologies generally include sensors, dedicated AI chips, cloud computing, distributed storage, big data processing, operation/interaction systems, and mechatronics. Artificial intelligence software technologies mainly include computer vision, speech processing, natural language processing, and machine learning/deep learning.
In the embodiment of the application, the artificial intelligence software technologies mainly involved include computer vision and machine learning/deep learning.
The display method provided by the application can be applied to display devices with data processing capability, such as terminal devices and servers. The terminal device may be, but is not limited to, a smart phone, desktop computer, notebook computer, tablet computer, smart speaker, smart watch, or AR device; the server may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing cloud computing services. The terminal device and the server may be connected directly or indirectly through wired or wireless communication, which is not limited in the present application.
The display device may be provided with computer vision capabilities. Computer Vision (CV) is the science of how to make machines "see": using cameras and computers in place of human eyes to recognize and measure targets, and further processing the resulting images so that they are better suited for human observation or for transmission to instruments for detection. As a scientific discipline, computer vision studies related theories and technologies in an attempt to build artificial intelligence systems that can acquire information from images or multidimensional data. Computer vision techniques typically include image processing, image recognition, image semantic understanding, image retrieval, OCR, video processing, video semantic understanding, video content/behavior recognition, three-dimensional object reconstruction, 3D techniques, virtual reality, augmented reality, and simultaneous localization and mapping.
The display device may be provided with machine learning capabilities. Machine Learning (ML) is a multi-field interdisciplinary subject involving probability theory, statistics, approximation theory, convex analysis, algorithm complexity theory, and other disciplines. It studies how a computer can simulate or implement human learning behavior to acquire new knowledge or skills and reorganize existing knowledge structures to continuously improve its own performance. Machine learning is the core of artificial intelligence and the fundamental way to make computers intelligent; it is applied throughout all areas of artificial intelligence. Machine learning and deep learning typically include techniques such as artificial neural networks, belief networks, reinforcement learning, transfer learning, inductive learning, and learning from instruction.
In the display method provided by the embodiment of the application, the artificial intelligence model adopted mainly involves the application of computer vision technology, and fusion of the virtual object with the real scene is achieved through related techniques in computer vision such as 3D, VR, and AR.
To facilitate understanding of the technical scheme of the present application, the display method provided by the embodiment is first introduced with a terminal serving as the display device, in combination with an actual application scenario.
Referring to fig. 1, the application scenario of a display method provided by an embodiment of the present application is shown. In the application scenario shown in fig. 1, the terminal device 100 is AR glasses (an AR device), and a user can see a virtual object fused with a real scene by wearing the AR glasses.
The display component of the AR glasses is the lens, which includes a projection layer 110 and a shielding layer 120. The two layers overlap in the display direction, with the shielding layer 120 behind the projection layer 110. As shown in fig. 1, both layers are located on the lens, and the projection layer 110 is closer to the user than the shielding layer 120.
The AR glasses project the virtual object on the projection layer 110, and the user can view not only the virtual object but also the real environment through the AR glasses, so as to realize the visual effect of the virtual object in the real scene.
Virtual objects in a real scene may be affected by ambient light, for example, when the background of the virtual object appears dark, the color of the virtual object may deepen. In order to avoid that the visual effect of the virtual object is difficult to match with the real environment due to the overlapping of the ambient light and the light shadow on the virtual object, the projection area of the virtual object on the projection layer 110 needs to be determined, and the first area corresponding to the projection area is determined in the shielding layer 120 according to the position corresponding relation between the projection layer 110 and the shielding layer 120.
In the application scenario shown in fig. 1, the virtual object is in a "cross" pattern, the projection area of the virtual object projected onto the projection layer 110 is in a "cross" pattern, and in the shielding layer 120, the first area corresponding to the projection area is also in a "cross" pattern.
When the AR glasses project the virtual object, the state of the first area is switched from the light-transmitting state to the shielding state, and the second area of the shielding layer, outside the first area, remains light-transmitting. As shown in fig. 1, in the shielding layer 120, the first region does not allow ambient light through while the second region does. Ambient light therefore does not reach the projection area of the projection layer 110 through the shielding layer 120, and the virtual object projected there is not affected by it: the virtual object is not prone to abnormal transparency, distortion, or similar artifacts caused by superimposed ambient light from the real scene acting as a background, its normal display to the user is not disturbed, and the display quality the user observes is more realistic.
Meanwhile, the second area of the shielding layer 120 remains light-transmitting while the virtual object is projected and does not block ambient light, so the user can normally observe the real environment around the virtual object and obtain the AR visual experience of a virtual object situated in the real environment. The virtual object thus fuses better with the real environment, its realism in the real scene is improved, and the user's experience is enhanced.
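The "cross" scenario of fig. 1 can be reproduced on a small grid. A minimal sketch, assuming a hypothetical 5x5 cell layout (the actual layers are continuous optics, not cell grids):

```python
# Build the "cross"-pattern first area of fig. 1 on an n x n grid.
def cross_mask(n):
    mid = n // 2
    return {(r, c) for r in range(n) for c in range(n) if r == mid or c == mid}

first_area = cross_mask(5)   # cells switched to the shielding state
second_area = {(r, c) for r in range(5) for c in range(5)} - first_area
# first_area blocks ambient light behind the projected "cross"; second_area
# stays light-transmitting so the surrounding real scene remains visible.
```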
The following describes a display method provided by the embodiment of the application by using a terminal device as a display device in combination with a drawing.
Referring to fig. 2, which is a schematic diagram of an AR device according to an embodiment of the present application, the AR device 200 includes a display component 210, a projection component 220, and a processing component 230. The display assembly includes a projection layer 211 and a shielding layer 212 overlapping in the display direction; the shielding layer 212 is located behind the projection layer 211 in the display direction and can cover the display range of the virtual object on the projection layer.
Referring to fig. 3, a schematic diagram of AR glasses according to an embodiment of the present application is shown. The AR glasses 200 are an AR device: the display component 210 of the AR glasses 200 is the lens part, while the projection component 220 and the processing component 230 may each be part of the components located on the glasses frame. The projection component 220 and the processing component 230 may be integrated together or provided separately; the present application does not specifically limit their configuration. The projection layer 211 and the shielding layer 212 are both overlaid on the lens, with the projection layer 211 closer to the user than the shielding layer 212.
The projection component 220 is configured to project a virtual object on the projection layer 211, i.e., onto the lens. The processing component 230 is configured to determine the projection area of the virtual object on the projection layer 211, so as to control the shielding layer 212 to switch the first area corresponding to the projection area into the shielding state when the virtual object is projected. The processing performed by the processing component 230 is described below with reference to fig. 4.
Referring to fig. 4, a flowchart of a display method according to an embodiment of the present application is shown. As shown in FIG. 4, the display method includes the following steps.
S401, determining a projection area of the virtual object on a projection layer.
An AR device in the related art is affected by ambient light when it projects a virtual object. The reason is first explained in principle.
Referring to fig. 5, a display link diagram of an AR device displaying a virtual object is shown. When a user uses the AR device, the view of the real environment must be superimposed on the display: the ambient light of the real environment overlaps with the light of the virtual object projected by the AR device before reaching the human eye. The influence of ambient light on the virtual object therefore cannot be avoided, and the ambient light of the real environment adversely affects the imaging of the virtual object.
For example, referring to fig. 6, a diagram of a user viewing a virtual object through a related-art AR device is shown. In fig. 6, the user wears AR glasses, and the virtual object projected by the glasses is a white blocky figure standing against a triangle background of uneven color. The figure observed by the human eye is affected by the ambient light of the triangle background: in fig. 6 it is divided into several regions of different darkness, with some regions deepened in color and different regions deepened to different degrees. The figure therefore looks very false, lacks realism, and degrades the user's experience.
Based on this, in order to reduce the influence of ambient light on the virtual object, the related art proposes two schemes: increasing the display brightness of the AR device, and reducing the light transmittance of the AR device. They are described below in turn.
Scheme one: increase the display brightness of the AR device.
To give the user better color contrast and saturation, the intensity of the light emitted by the display device can be increased. Referring to fig. 7, a display link diagram of an AR device displaying a virtual object is shown. The intensity of the ambient light is measured in real time by an ambient light intensity sensor, or the light intensity required for displaying the virtual object is raised appropriately according to typical ambient light levels. This increases the proportion of the virtual object's brightness in the light reaching the user's eyes and makes the virtual object stand out in the real scene, giving the user a better visual experience.
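Scheme one can be sketched as a simple brightness control. The sensor reading range, gain, and brightness cap below are invented for illustration and are not values from the document:

```python
# Hypothetical "scheme one" control: raise projector brightness with the
# ambient light reading, up to a cap. All numbers are illustrative.
def display_brightness(ambient_lux, base_nits=500.0, gain=0.5, max_nits=3000.0):
    """Return the projector brightness for a given ambient light level."""
    return min(max_nits, base_nits + gain * ambient_lux)
```

The cap hints at the drawback discussed below: higher brightness directly raises power draw and heat.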
Scheme two: reduce the light transmittance of the AR device.
To reduce the influence of ambient light on the display of the virtual object, the transmittance of the lenses of the AR device to ambient light can be lowered. With as little ambient light as possible passing through, its influence on the virtual object is weakened and the user has a better visual experience.
However, increasing the display brightness of the AR device obviously increases its power consumption, which in turn requires a larger battery and a better heat-dissipation scheme, further enlarging the device. The resulting weight makes the AR device uncomfortable to wear and clearly degrades the user experience. Scheme two, lowering the transmittance of the lenses, does improve the display effect, but the low overall transmittance makes it hard for the user to see the real environment, greatly harming practical interactive operation. Moreover, both schemes aggravate the sense of unreality of the virtual object and fail to bring the user a better experience.
In view of this, the present application reduces the light transmittance of the area where the virtual object is displayed while keeping the transmittance of the other, non-display areas unchanged, dynamically adjusting the transmittance of the display area. The display effect and realism of the virtual object are thereby improved, and the user's experience enhanced, without affecting the user's view of objects in the real scene.
Thus, in order to adjust the light transmittance corresponding to the region where the virtual object is displayed, the projection region of the virtual object on the projection layer is determined.
The projection assembly may adopt different optical display architectures for projecting the virtual object on the projection layer, such as the prism scheme shown in fig. 8a, the birdbath scheme shown in fig. 8b, the free-form surface scheme shown in fig. 8c, the off-axis holographic lens scheme shown in fig. 8d, and the optical waveguide (lightguide) scheme shown in fig. 8e. The projection area of the virtual object on the projection layer may be determined according to the adopted architecture.
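Whatever the optical architecture, step S401 ultimately yields a set of covered positions. One hedged way to model it, assuming the projected frame is available as RGBA pixels (a representation not specified by the document):

```python
# Hypothetical derivation of the projection area: every pixel of the
# rendered virtual-object frame with non-zero alpha is treated as covered.
def projection_area(rgba_pixels):
    """rgba_pixels: 2-D list of (r, g, b, a) tuples for the projected frame."""
    return {(row, col)
            for row, line in enumerate(rgba_pixels)
            for col, (_, _, _, a) in enumerate(line)
            if a > 0}
```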
S402, determining a first area of the shielding layer corresponding to the projection area according to the position corresponding relation between the projection layer and the shielding layer.
As described above, the projection layer and the shielding layer are different layers of the display assembly: the projection layer displays the virtual object, while the shielding layer controls the transmittance of ambient light. To reduce the influence of ambient light on the virtual object, the light transmittance behind the projection area where the virtual object sits must be lowered. The first area corresponding to the projection area can therefore be determined in the shielding layer according to the position correspondence between the projection layer and the shielding layer, and the transmittance of that first area lowered.
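The position correspondence of step S402 depends on the optics and is not specified in the document. As one assumed model, it can be treated as a per-axis scale and offset between projection-layer and shielding-layer coordinates:

```python
# Hedged model of the projection-layer to shielding-layer correspondence
# as an affine (scale + offset) coordinate mapping. A real device would
# calibrate this mapping to its own optical stack.
def to_first_area(projection_area, scale=(1.0, 1.0), offset=(0, 0)):
    sx, sy = scale
    ox, oy = offset
    return {(round(x * sx) + ox, round(y * sy) + oy)
            for (x, y) in projection_area}
```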
S403, switching the state of the first area from the light transmission state to the shielding state, and keeping the second area except the first area in the shielding layer in the light transmission state.
In the shielding layer, an area in the light-transmitting state does not obstruct the transmission of ambient light, while an area in the shielding state does. After the first area is determined, its state is switched from the light-transmitting state to the shielding state, blocking ambient light from passing through the shielding layer and thereby reducing the light transmittance of the first area. In the display direction, the first area is located behind the projection area, so reducing its light transmittance reduces the ambient light entering the projection layer. This mitigates abnormal transparency, distortion, and similar artifacts caused by superimposed ambient light, significantly improves the contrast and color saturation of the virtual object, and makes the virtual object observed by the user appear more real. Moreover, the volume and power consumption of the AR device are not significantly increased; compared with the first scheme, the portability and user-friendliness of the AR device are greatly improved.
The embodiment of the present application does not specifically limit the degree of shielding. For example, the light transmittance of the first area may be reduced moderately, so that less ambient light enters the first area, the intensity of the ambient light is weakened, and its influence on the virtual object is reduced. As another example, the light transmittance of the first area may be reduced to 0, so that no ambient light enters the first area at all and the ambient light has no influence on the virtual object.
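As an illustrative sketch (not from the patent), the two degrees of shielding above can be modeled as simple attenuation of the ambient light reaching the projection layer; a transmittance of 0 corresponds to the fully shielded case described next.

```python
def attenuate(ambient_intensity: float, transmittance: float) -> float:
    """Ambient light intensity passing the first area, for 0 <= transmittance <= 1.

    transmittance = 1.0 leaves ambient light unchanged; 0.0 blocks it entirely.
    """
    if not 0.0 <= transmittance <= 1.0:
        raise ValueError("transmittance must be within [0, 1]")
    return ambient_intensity * transmittance
```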
The following describes an example in which the light transmittance of the first area is changed to 0. Referring to fig. 9, which is a schematic diagram of the display link of an AR device displaying a virtual object according to an embodiment of the present application. In the first area, the shielding layer blocks all ambient light from entering, so the virtual object in the projection area is not superimposed with ambient light from the real environment, and human eyes observe only the light of the virtual object projected by the AR device. Therefore, the influence of ambient light on the virtual object is avoided; the virtual object appears more solid, more like an object existing in the real environment, the virtual object and the real environment are better fused, and the interactive experience of the user is improved.
In the shielding layer, the state of the first area is switched to the shielding state, while the second area other than the first area is kept in the light-transmitting state, so ambient light still enters the second area and the user can clearly observe the real environment. Since only the light transmittance of the first area is reduced, the light transmittance of the areas not displaying the virtual object is preserved to the greatest extent. This well guarantees the user's interactive experience with the AR device in the real environment, satisfying the dual requirements of a good display effect for the virtual object and a clear view of the real environment.
As can be seen from the above technical solution, the display assembly of the AR device includes a projection layer and a shielding layer that overlap in the display direction, with the shielding layer behind the projection layer. The AR device projects the virtual object on the projection layer to realize the visual effect of the virtual object in the real scene. To avoid the visual effect being mismatched with the real environment because ambient light and shadows overlap the virtual object, the projection area of the virtual object on the projection layer is determined, and the first area corresponding to the projection area is determined in the shielding layer according to the position correspondence between the projection layer and the shielding layer. When the augmented reality device projects the virtual object, the state of the first area is switched from the light-transmitting state to the shielding state, and the second area of the shielding layer other than the first area is kept in the light-transmitting state. Therefore, when a user uses the AR device, since the shielding layer is farther from the user than the projection layer, the first area that has switched into the shielding state prevents ambient light from penetrating the projection area without affecting the normal display of the virtual object to the user. The virtual object observed by the user thus appears more real, and is less prone to abnormal transparency and distortion caused by ambient light superimposed as a background in the real scene.
The second area in the shielding layer still keeps in a light transmission state when the virtual object is projected, and the transmission of ambient light is not blocked, so that a user can normally observe the real environment around the virtual object, and AR visual experience of the virtual object in the real environment is obtained.
It should be noted that the projection component projects the virtual object onto the projection layer as a series of static images (frames) that change continuously and are played at a certain rate (for example, 16 frames per second); the rate at which the projection component projects the virtual object is the refresh frequency of the virtual object.
The frequency at which the projection area of the virtual object on the projection layer is determined is kept consistent with the refresh frequency of the virtual object, and the first area corresponding to the projection area is shielded at the same time point. That is, whenever the virtual object on the projection layer is refreshed and its projection area is replaced, the first area in the shielding layer is replaced correspondingly, so that the light transmittance behind the projection area where the virtual object is located is controlled dynamically. In other words, the refresh frequency of the virtual object is matched with the switching of the first area from the light-transmitting state to the shielding state. When the virtual object changes dynamically, such as moving, rotating, scaling, materializing, or becoming transparent, the shielded area of the shielding layer changes synchronously with the virtual object on the projection layer: the first area is adjusted at any time along with the change of the projection area. This improves the display effect of the virtual object without breaking the user's sense of immersion.
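The per-frame synchronization described above can be sketched as follows. This is a hypothetical toy model (the `ShieldLayer` class and `sync_shield` function are invented for illustration): on every refresh tick the shield is cleared and re-applied to that frame's projection area, so the shielded region tracks the virtual object as it moves or changes.

```python
class ShieldLayer:
    """Toy shielding layer: tracks which region is currently in the shielding state."""
    def __init__(self):
        self.blocked = None          # None means the whole layer transmits light

    def clear(self):
        self.blocked = None          # restore full transmittance

    def block(self, area):
        self.blocked = area          # switch only this region to the shielding state

def sync_shield(frames, shield):
    """Update the shield once per projected frame, matching the refresh frequency."""
    history = []
    for area in frames:              # one projection area per frame
        shield.clear()               # the second area stays transparent
        shield.block(area)           # the first area switches to shielding
        history.append(shield.blocked)
    return history
```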
The embodiment of the present application does not specifically limit the shielding layer, as long as it can change the light transmittance locally; the following description takes the shielding layer being an electrochromic layer as an example.
Electrochromic materials are used for the electrochromic layer. Such materials have the property of electrochromism: their optical properties (reflectivity, transmissivity, absorptivity, and the like) undergo a stable and reversible change under an applied electric field, which appears outwardly as a reversible change of color and transparency.
Thus, the shielding layer may be an electrochromic layer. The processing component switches the state of the first region from the light-transmitting state to the shielding state by changing the electric field intensity in the first region, and keeps the second region in the light-transmitting state by maintaining the electric field intensity in the second region; that is, the transmittance of ambient light is controlled by changing the electric field intensity of the electrochromic layer.
Therefore, when the virtual object changes dynamically in the projection area of the projection layer, the electric field intensity in the first area of the shielding layer is adjusted correspondingly while the electric field intensity in the second area is maintained. No complex control logic is needed: simple electric-field control alone can reduce the transmittance of ambient light in the first area, or even block ambient light entirely, so that the virtual object observed by the user appears more real. The second area of the shielding layer remains in the light-transmitting state while the virtual object is projected and does not block the transmission of ambient light, so the user can normally observe the real environment around the virtual object and obtain the AR visual experience of the virtual object in the real environment, satisfying the dual requirements of improving the display effect of the virtual object and preserving the user's sense of immersion.
The case in which the electrochromic layer is an electro-variable grating module will be described as an example.
Referring to fig. 10, a schematic diagram of an electrochromic layer according to an embodiment of the present application is shown. The electro-variable grating module includes a first electrode layer 1001, a second electrode layer 1003, and a liquid crystal layer 1002. The liquid crystal layer 1002 is between the first electrode layer 1001 and the second electrode layer 1003 and is filled with a plurality of liquid crystals as the shielding material. The liquid crystals in the liquid crystal layer may be of a dark, light-blocking color, such as black; the darker the light-blocking color, the better the effect of reducing the transmittance of ambient light, so that the influence of ambient light on the virtual object is minimized.
Referring to fig. 11, the working principle of an electro-variable grating module according to an embodiment of the present application is shown. When the liquid crystal is in a natural state, that is, when the electric field intensity between the first electrode layer 1001 and the second electrode layer 1003 is small or 0, the liquid crystals are arranged randomly in a free posture, and there is no gap, or only a small gap, between the liquid crystals arranged in the free posture, thereby blocking ambient light from passing through the liquid crystal layer.
Referring to fig. 12, a schematic diagram of the working principle of an electro-variable grating module according to an embodiment of the present application is shown. When the electric field intensity between the first electrode layer 1001 and the second electrode layer 1003 is changed, the liquid crystals align in order perpendicular to the first electrode layer 1001 or the second electrode layer 1003, so that light passes through, thereby achieving light transmission.
Thus, in order to control the light transmittance of the first region, the processing component may control the first electrode layer 1001 and change the voltage V_data of the first electrode layer 1001 in the first region from the first voltage to the second voltage. If the difference between the second voltage and the voltage V_com of the second electrode layer 1003 is smaller than the threshold V_lc, i.e., |V_data - V_com| < V_lc, then, since the distance between the first electrode layer 1001 and the second electrode layer 1003 is generally unchanged, the electric field intensity in the liquid crystal layer 1002 falls below the field intensity required to maintain the ordered posture of the liquid crystal. The liquid crystal in the first region of the liquid crystal layer 1002 therefore changes from the ordered posture to the free posture, blocking ambient light from passing through the liquid crystal layer 1002 and switching the state of the first region from the light-transmitting state to the shielding state.
By controlling the first electrode layer 1001, the voltage of the first electrode layer 1001 in the second region is kept at the first voltage. If the difference between the first voltage and the voltage V_com of the second electrode layer 1003 is greater than or equal to the threshold V_lc, i.e., |V_data - V_com| ≥ V_lc, then, since the distance between the first electrode layer 1001 and the second electrode layer 1003 is generally unchanged, the electric field intensity in the liquid crystal layer 1002 is greater than or equal to the field intensity required to maintain the ordered posture of the liquid crystal. The liquid crystal in the second region of the liquid crystal layer therefore remains in the ordered posture, keeping the second region in the light-transmitting state.
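The threshold rule in the two cases above can be condensed into one comparison. A minimal sketch, assuming (as the notation above suggests) that the liquid crystal holds its ordered, transparent posture only while the magnitude of the voltage difference across it is at least V_lc:

```python
def region_state(v_data: float, v_com: float, v_lc: float) -> str:
    """'transparent' when |V_data - V_com| >= V_lc keeps the ordered posture;
    'shielding' when the difference falls below V_lc and the liquid crystal
    relaxes into the free posture, blocking ambient light."""
    return "transparent" if abs(v_data - v_com) >= v_lc else "shielding"
```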
Therefore, by changing the electric field intensity between the first electrode layer 1001 and the second electrode layer 1003, the liquid crystal layer 1002 can be switched between the shielding state and the light-transmitting state, so the transmittance of ambient light is controlled through the electric field intensity of the electro-variable grating module. This reduces the influence of ambient light on the virtual object in the first area without affecting the normal display of the virtual object to the user, and does not block the transmission of ambient light in the second area, so the user can still observe the real environment outside the virtual object and obtain the AR visual experience of the virtual object in the real environment.
It should be noted that the first electrode layer and the second electrode layer are not particularly limited; for example, both may be indium tin oxide (Indium Tin Oxide, ITO) electrodes.
As a possible implementation, the processing component may switch the state of the shielding layer through a switching tube. Taking the switching tube as a field-effect transistor (FET) as an example, see fig. 13, which is a schematic diagram of the working principle of the electro-variable grating module provided by an embodiment of the present application. By controlling the voltages of the gate line and the source line, the FET is turned on or off, thereby controlling the voltage V_data of the first electrode layer 1001 and hence the electric field intensity between the first electrode layer 1001 and the second electrode layer 1003, so that the liquid crystal layer 1002 can be switched between the shielding state and the light-transmitting state.
Two modes are described below, in which the voltage of the second electrode layer is set to a high level and a low level respectively.
In the first mode, the voltage of the second electrode layer is high.
The source of the switching tube is connected to the source line, so the source of the switching tube carries a voltage; the gate of the switching tube is connected to the gate line. The processing component changes the voltage V_data of the first electrode layer 1001 in the first area from the first voltage to the second voltage by controlling the gates of the switching tubes of the first electrode layer 1001 in the first area: energizing the gate line puts a voltage on the gate and turns on the source and drain of the switching tube, so the voltage V_data of the first electrode layer 1001 in the first area changes from the low level to the high level. At this time, the voltage V_data of the first electrode layer 1001 and the voltage V_com of the second electrode layer 1003 are both at the high level, so |V_data - V_com| < V_lc, and the liquid crystal in the first region of the liquid crystal layer 1002 changes from the ordered posture to the free posture, switching the state of the first region from the light-transmitting state to the shielding state.
The processing component keeps the voltage of the first electrode layer 1001 in the second region at the first voltage by controlling the gates of the switching tubes of the first electrode layer 1001 in the second region: de-energizing the gate line leaves no voltage on the gate and cuts off the source and drain of the switching tube, so the voltage V_data of the first electrode layer 1001 in the second region remains at the low level. At this time, the voltage V_data of the first electrode layer 1001 is at the low level and the voltage V_com of the second electrode layer 1003 is at the high level, so the electric field intensity of the liquid crystal layer 1002 is greater than the required field intensity, the liquid crystal in the second region of the liquid crystal layer remains in the ordered posture, and the second region is kept in the light-transmitting state.
In the second mode, the voltage of the second electrode layer is low.
The source of the switching tube is connected to the source line, so the source of the switching tube carries a voltage; the gate of the switching tube is connected to the gate line. The processing component changes the voltage V_data of the first electrode layer 1001 in the first area from the first voltage to the second voltage by controlling the gates of the switching tubes of the first electrode layer 1001 in the first area: de-energizing the gate line leaves no voltage on the gate and cuts off the source and drain of the switching tube, so the voltage of the first electrode layer 1001 in the first area changes from the high level to the low level. At this time, the voltage V_data of the first electrode layer 1001 and the voltage V_com of the second electrode layer 1003 are both at the low level, so |V_data - V_com| < V_lc, and the liquid crystal in the first region of the liquid crystal layer 1002 changes from the ordered posture to the free posture, switching the state of the first region from the light-transmitting state to the shielding state.
The processing component keeps the voltage of the first electrode layer in the second region at the high level by controlling the gates of the switching tubes of the first electrode layer 1001 in the second region: energizing the gate line puts a voltage on the gate and turns on the source and drain of the switching tube. At this time, the voltage V_data of the first electrode layer 1001 is at the high level and the voltage V_com of the second electrode layer 1003 is at the low level, so the electric field intensity of the liquid crystal layer 1002 is greater than the required field intensity, the liquid crystal in the second region of the liquid crystal layer remains in the ordered posture, and the second region is kept in the light-transmitting state.
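The two drive modes can be summarized by one rule per pixel. This is an illustrative sketch (the function and parameter names are invented): shielding a pixel requires V_data to match V_com, so whether the gate line must be energized depends on the level of the second electrode layer.

```python
def gate_energized(v_com_high: bool, shield: bool) -> bool:
    """Gate state (True = energized) needed to reach the target pixel state.

    Mode one (V_com high): an energized gate pulls V_data high, matching V_com
    and shielding the pixel; a de-energized gate leaves V_data low, keeping it
    transparent. Mode two (V_com low) is the mirror image: a de-energized gate
    leaves V_data low, matching V_com and shielding the pixel.
    """
    return shield if v_com_high else not shield
```

Seen this way, mode two shields pixels with the gate line de-energized, which is consistent with the power-saving observation below.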
Compared with mode one, in which the voltage of the second electrode layer is at the high level, mode two, in which it is at the low level, saves more power on the AR device. The power consumption is reduced and the battery capacity of the AR device can also be reduced, so the AR device can be smaller, improving its portability and user-friendliness.
As a possible implementation, the switching tubes provided in the first electrode layer 1001 correspond to the display pixels of the virtual object, and the second electrode layer 1003 includes electrodes disposed opposite the switching tubes. An electric field may be formed after the first electrode layer and the second electrode layer are energized, so that the state of the first region is switched from the light-transmitting state to the shielding state by changing the electric field intensity in the first region, and the second region is kept in the light-transmitting state by maintaining the electric field intensity in the second region.
Therefore, per-pixel control, that is, overall or local change of the light transmittance of the shielding layer, can be realized through the electro-variable grating, ensuring a better display effect of the AR device. The number of switching tubes provided for each pixel is not particularly limited; for example, each pixel may be provided with one equivalent circuit as shown in fig. 13, or with a plurality of such equivalent circuits. The following description assumes one equivalent circuit as shown in fig. 13 per pixel.
Fig. 14 and fig. 15 are equivalent circuit diagrams of the electro-variable grating module corresponding to display pixels according to an embodiment of the present application. Fig. 14 and fig. 15 include two gate lines, carrying the voltage V_even and the voltage V_odd respectively. The display pixels corresponding to the switching tubes controlled by the gate line at voltage V_even are located in the first area, and the display pixels corresponding to the switching tubes controlled by the gate line at voltage V_odd are located in the second area.
The light transmittance of the first region and the second region is controlled through the gate lines at voltage V_even and voltage V_odd. In fig. 14, the gate line at voltage V_even is energized and the gate line at voltage V_odd is de-energized, so that the difference between the voltage V_even and the voltage V_com of the second electrode layer 1003 is smaller than the threshold V_lc; the state of the first region is thus switched from the light-transmitting state to the shielding state while the second region is kept in the light-transmitting state. In fig. 15, the gate line at voltage V_even is de-energized and the gate line at voltage V_odd is de-energized, so that the difference between the voltage V_even and the voltage V_com of the second electrode layer 1003 is controlled to be greater than or equal to the threshold V_lc; the state of the first region is thus switched from the shielding state to the light-transmitting state while the second region is kept in the light-transmitting state.
It should be noted that when the shielding layer is an electro-variable grating module, the optical waveguide scheme works relatively well as the optical display architecture of the projection component in the AR device, since it makes it easier to attach the electro-variable grating module closely to the display component during manufacturing and to align the display pixels.
As a possible implementation, reference is made to fig. 16, which is a schematic diagram of an electrochromic layer provided in an embodiment of the present application. The electro-variable grating module may further include a polarizer layer 1004, an upper glass layer 1005, and a lower glass layer 1006. In the display direction, from front to back, the layers are the polarizer layer 1004, the upper glass layer 1005, the first electrode layer 1001, the liquid crystal layer 1002, the second electrode layer 1003, and the lower glass layer 1006.
Next, an example will be described in which the user wears AR glasses as shown in fig. 3.
The processing component of the AR glasses first extracts or separates the virtual object to be displayed and determines its projection area in the projection layer. When the projection component projects the virtual object onto the projection layer of the display component, the processing component drives the electro-variable grating module in the shielding layer, so that the liquid crystal of the electro-variable grating module in the first area, corresponding to the projection area, changes from the ordered posture to the free posture and shields the ambient light of the first area, while the liquid crystal in the second area remains in the ordered posture and ambient light passes through normally. The region that the electro-variable grating module must shield matches, at all times, the changing projection region where the projection component projects the virtual object, so the shielded region is adjusted synchronously with movements, rotations, materialization, transparency changes, and other changes of the virtual object within the visible region.
Therefore, the electro-variable grating module can shield light dynamically in coordination with the virtual object, preventing ambient light outside the AR device from passing through the lens and overlapping the light with which the AR device displays the virtual object, which would otherwise cause color shifts and display defects against complex backgrounds. The user sees an image with a more consistent display effect: the virtual object looks like a real object from everyday life rather than the semitransparent image common on general AR devices, so the realism is higher. Meanwhile, the areas not displaying the virtual object retain good light transmittance, so the user can complete actual interactive operations normally, improving the user experience.
The foregoing display device may be a computer device, which may be a server or a terminal device. The computer device provided in the embodiments of the present application will be described below from the perspective of hardware implementation. Fig. 17 is a schematic structural diagram of a server, and fig. 18 is a schematic structural diagram of a terminal device.
Referring to fig. 17, fig. 17 is a schematic diagram of a server structure provided by an embodiment of the present application. The server 1400 may vary considerably in configuration or performance, and may include one or more central processing units (central processing units, CPU) 1422 (e.g., one or more processors), memory 1432, and one or more storage media 1430 (e.g., one or more mass storage devices) that store applications 1442 or data 1444. The memory 1432 and the storage media 1430 may be transitory or persistent storage. The program stored in a storage medium 1430 may include one or more modules (not shown), each of which may include a series of instruction operations on the server. Further, the central processor 1422 may communicate with the storage medium 1430 to perform the series of instruction operations in the storage medium 1430 on the server 1400.
The server 1400 may also include one or more power supplies 1426, one or more wired or wireless network interfaces 1450, one or more input/output interfaces 1458, and/or one or more operating systems 1441, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and the like.
The steps performed by the server in the above embodiments may be based on the server structure shown in fig. 17.
Wherein, the CPU 1422 is configured to perform the following steps:
Determining a projection area of a virtual object on a projection layer, wherein the projection layer is arranged in a display component of the augmented reality device, and the display component further comprises a shielding layer which is overlapped with the projection layer in a display direction, and the shielding layer is positioned behind the projection layer in the display direction;
Determining a first area of the shielding layer corresponding to the projection area according to the position corresponding relation between the projection layer and the shielding layer;
and switching the state of the first area from a light transmission state to a shielding state, and keeping a second area except the first area in the shielding layer in the light transmission state, wherein the area in the light transmission state in the shielding layer does not block the transmission of the ambient light, and the area in the shielding state in the shielding layer blocks the transmission of the ambient light.
Optionally, the CPU 1422 may further perform method steps of any specific implementation of the display method in an embodiment of the present application.
Referring to fig. 18, fig. 18 is a schematic structural diagram of a terminal device according to an embodiment of the present application. Fig. 18 is a block diagram illustrating part of the structure of a smartphone serving as the terminal device according to an embodiment of the present application. The smartphone includes a radio frequency (RF) circuit 1510, a memory 1520, an input unit 1530, a display unit 1540, a sensor 1550, an audio circuit 1560, a wireless fidelity (WiFi) module 1570, a processor 1580, and a power supply 1590. Those skilled in the art will appreciate that the smartphone structure shown in fig. 18 does not limit the smartphone, which may include more or fewer components than shown, combine certain components, or arrange the components differently.
The following describes each component of the smart phone in detail with reference to fig. 18:
The RF circuit 1510 is used for receiving and transmitting signals during messaging or calls; specifically, it receives downlink information from a base station, hands it to the processor 1580 for processing, and transmits uplink data to the base station. Generally, the RF circuit 1510 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (Low Noise Amplifier, LNA), a duplexer, and the like. In addition, the RF circuit 1510 may also communicate with networks and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), etc.
The memory 1520 may be used to store software programs and modules, and the processor 1580 implements various functional applications and data processing of the smartphone by running the software programs and modules stored in the memory 1520. The memory 1520 may mainly include a storage program area that may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), etc., and a storage data area that may store data created according to the use of the smart phone (such as audio data, a phonebook, etc.), etc. In addition, memory 1520 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The input unit 1530 may be used to receive input numerical or character information and generate key signal inputs related to user settings and function control of the smart phone. In particular, the input unit 1530 may include a touch panel 1531 and other input devices 1532. The touch panel 1531, also referred to as a touch screen, may collect touch operations thereon or thereabout by a user (e.g., operations of the user on the touch panel 1531 or thereabout by using any suitable object or accessory such as a finger, a stylus, etc.), and drive the corresponding connection device according to a predetermined program. Alternatively, the touch panel 1531 may include two parts, a touch detection device and a touch controller. The touch controller receives touch information from the touch detection device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 1580, and can receive and execute commands sent by the processor 1580. In addition, the touch panel 1531 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. The input unit 1530 may include other input devices 1532 in addition to the touch panel 1531. In particular, other input devices 1532 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, mouse, joystick, etc.
The display unit 1540 may be used to display information input by a user or information provided to the user and various menus of the smartphone. The display unit 1540 may include a display panel 1541; optionally, the display panel 1541 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch panel 1531 may cover the display panel 1541; when the touch panel 1531 detects a touch operation on or near it, the touch operation is transferred to the processor 1580 to determine the type of touch event, and the processor 1580 then provides a corresponding visual output on the display panel 1541 according to the type of touch event. Although in fig. 18 the touch panel 1531 and the display panel 1541 are two separate components implementing the input and output functions of the smartphone, in some embodiments the touch panel 1531 may be integrated with the display panel 1541 to implement the input and output functions of the smartphone.
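The touch pipeline described above (touch detection, coordinate conversion by the touch controller, then event-type determination by the processor) can be illustrated with a minimal sketch. The function name, event labels, and threshold below are hypothetical and for illustration only; they are not part of the patent.

```python
# Hypothetical sketch of event-type classification from two sampled touch
# coordinates (x, y), as a touch controller might forward them to the processor.
def classify_touch(prev, curr, tap_threshold=10):
    """Classify a touch event: a small displacement is a tap, otherwise a swipe."""
    dx, dy = curr[0] - prev[0], curr[1] - prev[1]
    if abs(dx) <= tap_threshold and abs(dy) <= tap_threshold:
        return "tap"
    # Dominant axis of motion decides the swipe direction.
    return "swipe_horizontal" if abs(dx) > abs(dy) else "swipe_vertical"

print(classify_touch((100, 100), (102, 101)))  # small movement -> "tap"
print(classify_touch((100, 100), (260, 110)))  # -> "swipe_horizontal"
```

A real touch stack would also track timing (long press vs. tap) and multi-touch, but the coordinate-to-event mapping follows the same pattern.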
The smartphone may also include at least one sensor 1550, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor: the ambient light sensor may adjust the brightness of the display panel 1541 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 1541 and/or the backlight when the smartphone is moved to the ear. As one kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in all directions (generally three axes), can detect the magnitude and direction of gravity when static, and can be used for applications that identify the posture of the smartphone (such as landscape/portrait switching, related games, and magnetometer posture calibration) and for vibration-recognition functions (such as a pedometer or tap detection). Other sensors that may also be configured in the smartphone, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, are not described herein again.
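The posture recognition mentioned above relies on the fact that a static accelerometer reads gravity along its axes. A minimal, hypothetical sketch of landscape/portrait detection from a three-axis reading follows; the function name and the simple pitch/roll comparison are illustrative assumptions, not the patent's method.

```python
import math

def device_posture(ax, ay, az):
    """Infer portrait/landscape posture from gravity on a 3-axis accelerometer.

    When the device is static, the sensor reads gravity (~9.8 m/s^2)
    distributed over its x, y, z axes.
    """
    # Tilt about the x axis (gravity along y -> device upright).
    pitch = math.degrees(math.atan2(ay, math.hypot(ax, az)))
    # Tilt about the y axis (gravity along x -> device on its side).
    roll = math.degrees(math.atan2(ax, math.hypot(ay, az)))
    return "portrait" if abs(pitch) > abs(roll) else "landscape"

print(device_posture(0.0, 9.8, 0.3))  # gravity mostly along y -> "portrait"
print(device_posture(9.8, 0.1, 0.3))  # gravity mostly along x -> "landscape"
```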
The audio circuit 1560, the speaker 1561, and the microphone 1562 may provide an audio interface between a user and the smartphone. The audio circuit 1560 may transmit an electrical signal, converted from received audio data, to the speaker 1561, which converts it into a sound signal for output. Conversely, the microphone 1562 converts a collected sound signal into an electrical signal, which the audio circuit 1560 receives and converts into audio data; after the audio data is processed by the processor 1580, it may be transmitted to, for example, another smartphone via the RF circuit 1510, or output to the memory 1520 for further processing.
WiFi is a short-distance wireless transmission technology. Through the WiFi module 1570, the smartphone can help a user send and receive emails, browse webpages, access streaming media, and the like, providing wireless broadband Internet access for the user. Although fig. 18 shows the WiFi module 1570, it can be understood that it is not an essential component of the smartphone and may be omitted as needed without changing the essence of the invention.
The processor 1580 is the control center of the smartphone; it connects various parts of the entire smartphone using various interfaces and lines, and performs various functions of the smartphone and processes data by running or executing the software programs and/or modules stored in the memory 1520 and invoking data stored in the memory 1520. Optionally, the processor 1580 may include one or more processing units; preferably, the processor 1580 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 1580.
The smartphone also includes a power source 1590 (e.g., a battery) for powering the various components. Preferably, the power source may be logically connected to the processor 1580 via a power management system, so that functions such as managing charging, discharging, and power consumption are implemented through the power management system.
Although not shown, the smart phone may further include a camera, a bluetooth module, etc., which will not be described herein.
In an embodiment of the present application, the memory 1520 included in the smart phone may store program codes and transmit the program codes to the processor.
The processor 1580 included in the smart phone may execute the display method provided in the foregoing embodiment according to the instructions in the program code.
The embodiment of the application also provides a computer readable storage medium for storing a computer program for executing the display method provided in the above embodiment.
Embodiments of the present application also provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the display methods provided in the various alternative implementations of the above aspects.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above method embodiments may be implemented by hardware related to program instructions. The above program may be stored in a computer-readable storage medium, and when executed, the program performs steps including those of the above method embodiments. The storage medium may be at least one of various media that can store program code, such as a read-only memory (ROM), a RAM, a magnetic disk, or an optical disk.
It should be noted that, in the present specification, each embodiment is described in a progressive manner; identical and similar parts of the embodiments may be referred to each other, and each embodiment focuses on its differences from the other embodiments. In particular, for the apparatus and system embodiments, since they are substantially similar to the method embodiments, the description is relatively simple; for relevant parts, reference may be made to the description of the method embodiments. The apparatus and system embodiments described above are merely illustrative: units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement the present invention without creative effort.
The foregoing is only one specific embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions easily contemplated by those skilled in the art within the technical scope of the present application should be included in the scope of the present application. Therefore, the protection scope of the present application should be subject to the protection scope of the claims.

Claims (9)

1. A display method, the method comprising:
determining a projection area of a virtual object on a projection layer, wherein the projection layer is arranged in a display component of augmented reality equipment, the display component further comprises a shielding layer which is overlapped with the projection layer in a display direction, the shielding layer is positioned behind the projection layer in the display direction, the shielding layer is an electro-variable grating module, the electro-variable grating module is closely attached to the display component, the electro-variable grating module comprises a first electrode layer, a second electrode layer and a liquid crystal layer, the first electrode layer is provided with a switch tube corresponding to a display pixel, the second electrode layer comprises an electrode which is arranged opposite to the switch tube, and the voltage of the second electrode layer is at a low level;
Determining a first area of the shielding layer corresponding to the projection area according to the position corresponding relation between the projection layer and the shielding layer;
turning off a source and a drain of a switching tube by controlling a gate of the switching tube in the first electrode layer in the first region to be powered off, so that a voltage of the first electrode layer in the first region is changed from a high level to a low level;
if the difference between the voltage of the first electrode layer and the voltage of the second electrode layer is smaller than a threshold value, the liquid crystal in the first area in the liquid crystal layer is adjusted from an ordered posture to a free posture, so that the state of the first area is switched from the light transmission state to the shielding state;
Switching on a source electrode and a drain electrode of a switching tube in the first electrode layer of a second region outside the first region by controlling the gate of the switching tube to be electrified, so that the voltage of the first electrode layer in the second region is kept at a high level;
if the difference between the voltage of the first electrode layer and the voltage of the second electrode layer is greater than or equal to the threshold value, the liquid crystal in the second area in the liquid crystal layer is kept in an ordered posture, so that the second area is kept in the light-transmitting state, the area in the light-transmitting state in the shielding layer does not obstruct the transmission of ambient light, and the area in the shielding state in the shielding layer obstructs the transmission of ambient light;
wherein a determination frequency of the projection area of the virtual object on the projection layer is consistent with a refresh frequency of the virtual object, so that when the projection area of the virtual object on the projection layer changes dynamically, the first area of the shielding layer corresponding to the projection area and its shielding state change synchronously with the change of the virtual object on the projection layer, and the first area corresponding to the projection area of the virtual object at the same point in time is in the shielding state.
2. The method of claim 1, wherein in the display direction, the electro-variable grating module comprises a polarizer layer, an upper glass layer, the first electrode layer, the liquid crystal layer, the second electrode layer, and a lower glass layer in that order from front to back.
3. The method of claim 1, wherein the liquid crystal in the liquid crystal layer is dark in color to block light.
4. An augmented reality device, comprising a display assembly, a projection assembly and a processing assembly, wherein the display assembly comprises a projection layer and a shielding layer which are overlapped in a display direction, the shielding layer is positioned behind the projection layer in the display direction, the shielding layer is an electro-variable grating module, the electro-variable grating module is closely attached to the display assembly, the electro-variable grating module comprises a first electrode layer, a second electrode layer and a liquid crystal layer, the first electrode layer is provided with a switch tube corresponding to a display pixel, the second electrode layer comprises an electrode which is arranged opposite to the switch tube, and the voltage of the second electrode layer is at a low level;
The projection component is used for projecting a virtual object on the projection layer;
The processing component is used for determining a first area of the shielding layer corresponding to the projection area according to the position corresponding relation between the projection layer and the shielding layer when determining the projection area of the virtual object on the projection layer; turning off a source and a drain of a switching tube by controlling a gate of the switching tube in the first electrode layer in the first region to be powered off, so that a voltage of the first electrode layer in the first region is changed from a high level to a low level; if the difference between the voltage of the first electrode layer and the voltage of the second electrode layer is smaller than a threshold value, adjusting the liquid crystal in the first area in the liquid crystal layer from an ordered posture to a free posture, so that the state of the first area is switched from the light transmission state to the shielding state; and switching on a source electrode and a drain electrode of a switching tube in the first electrode layer of a second region outside the first region by controlling a gate of the switching tube to be electrified, so that the voltage of the first electrode layer in the second region is kept at a high level;
wherein a determination frequency of the projection area of the virtual object on the projection layer is consistent with a refresh frequency of the virtual object, so that when the projection area of the virtual object on the projection layer changes dynamically, the first area of the shielding layer corresponding to the projection area and its shielding state change synchronously with the change of the virtual object on the projection layer, and the first area corresponding to the projection area of the virtual object at the same point in time is in the shielding state.
5. The augmented reality device of claim 4, wherein in the display direction, the electro-variable grating module comprises a polarizer layer, an upper glass layer, the first electrode layer, the liquid crystal layer, the second electrode layer, and a lower glass layer in that order from front to back.
6. The augmented reality device of claim 4, wherein the liquid crystal in the liquid crystal layer is dark in light-blocking color.
7. A computer device, the device comprising a processor and a memory:
The memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to perform the method of any of claims 1-3 according to instructions in the program code.
8. A computer readable storage medium for storing a computer program for execution by a processor to implement the method of any one of claims 1-3.
9. A computer program product, characterized in that it comprises computer instructions stored in a computer-readable storage medium, from which computer instructions a processor of a computer device reads, which processor executes the computer instructions, causing the computer device to perform the method of any one of claims 1-3.
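As an illustration only, the occlusion control recited in claims 1-3 can be sketched as follows: the projection area of the virtual object is mapped through the projection-layer/shielding-layer position correspondence, and the switch tubes of the mapped first area are gated off so that it blocks ambient light while the second area stays transparent. The data structures, function name, and a simple (dx, dy) offset correspondence are hypothetical simplifications, not the claimed implementation.

```python
# Hedged sketch of per-pixel shield-layer control for an AR occlusion mask.
def shield_mask(projection_area, offset, shield_size):
    """Return per-pixel gate states for the shielding layer.

    projection_area: set of (x, y) pixels the virtual object covers on
                     the projection layer.
    offset:          (dx, dy) position correspondence between the layers
                     (an illustrative stand-in for the real mapping).
    shield_size:     (width, height) of the shielding layer.

    Gate ON  -> electrode held high -> liquid crystal ordered -> transparent.
    Gate OFF -> electrode falls low -> liquid crystal free    -> blocking.
    """
    w, h = shield_size
    # The "first area": projection area mapped onto the shielding layer.
    first_region = {(x + offset[0], y + offset[1]) for x, y in projection_area}
    # Everything else is the "second area" and stays transparent (gate ON).
    return [["OFF" if (x, y) in first_region else "ON" for x in range(w)]
            for y in range(h)]

mask = shield_mask({(0, 0), (1, 0)}, offset=(1, 1), shield_size=(4, 3))
print(mask[1])  # ['ON', 'OFF', 'OFF', 'ON'] - the mapped pixels block light
```

Recomputing this mask at the virtual object's refresh frequency keeps the blocking region synchronized with the moving projection, as the final limitation of claim 1 requires.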
CN202110377394.2A 2021-04-08 2021-04-08 Display method and related device Active CN112950791B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110377394.2A CN112950791B (en) 2021-04-08 2021-04-08 Display method and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110377394.2A CN112950791B (en) 2021-04-08 2021-04-08 Display method and related device

Publications (2)

Publication Number Publication Date
CN112950791A CN112950791A (en) 2021-06-11
CN112950791B true CN112950791B (en) 2025-07-22

Family

ID=76231137

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110377394.2A Active CN112950791B (en) 2021-04-08 2021-04-08 Display method and related device

Country Status (1)

Country Link
CN (1) CN112950791B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115598838A (en) * 2021-06-28 2023-01-13 北京有竹居网络技术有限公司(Cn) Control method and device of AR head-mounted display equipment and electronic equipment
CN113741162B (en) 2021-09-06 2022-11-22 联想(北京)有限公司 Image projection method and system
JP7567762B2 (en) * 2021-12-13 2024-10-16 トヨタ自動車株式会社 AR Glasses
CN114415998B (en) * 2021-12-21 2025-05-02 联想(北京)有限公司 Display method, electronic device and medium
JP2023125867A (en) * 2022-02-28 2023-09-07 富士フイルム株式会社 Glass-type information display device, display control method, and display control program
CN114779948B (en) * 2022-06-20 2022-10-11 广东咏声动漫股份有限公司 Method, device and equipment for controlling instant interaction of animation characters based on facial recognition
CN115061584A (en) * 2022-06-28 2022-09-16 联想(北京)有限公司 Image display method and processor of head-mounted equipment
CN117079168A (en) * 2023-08-03 2023-11-17 歌尔科技有限公司 Image processing method, device, head-mounted display equipment and medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105334677A (en) * 2015-11-12 2016-02-17 友达光电股份有限公司 Liquid crystal display device
CN105934902A (en) * 2013-11-27 2016-09-07 奇跃公司 Virtual and augmented reality systems and methods
CN111399230A (en) * 2020-05-12 2020-07-10 潍坊歌尔电子有限公司 Display system and head-mounted display equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1124511C (en) * 1997-03-10 2003-10-15 佳能株式会社 Liquid crystal display, mfg. method therefor, and liquid crystal projector using said display
JP6729389B2 (en) * 2015-04-30 2020-07-22 ソニー株式会社 Display device
CN112639579B (en) * 2018-08-31 2023-09-15 奇跃公司 Spatially resolved dynamic dimming for augmented reality devices

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105934902A (en) * 2013-11-27 2016-09-07 奇跃公司 Virtual and augmented reality systems and methods
CN105334677A (en) * 2015-11-12 2016-02-17 友达光电股份有限公司 Liquid crystal display device
CN111399230A (en) * 2020-05-12 2020-07-10 潍坊歌尔电子有限公司 Display system and head-mounted display equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Jiang Mingli et al. Microcomputer Hardware Composition. China Machine Press, 2000, pp. 112-113. *

Also Published As

Publication number Publication date
CN112950791A (en) 2021-06-11

Similar Documents

Publication Publication Date Title
CN112950791B (en) Display method and related device
US20230143323A1 (en) Shadow rendering method and apparatus, computer device, and storage medium
US10701346B2 (en) Replacing 2D images with 3D images
US11145096B2 (en) System and method for augmented reality interaction
CN113227942B (en) Audio indicators of user attention in AR/VR environments
US10943388B1 (en) Intelligent stylus beam and assisted probabilistic input to element mapping in 2D and 3D graphical user interfaces
US9864198B2 (en) Head-mounted display
CN104793842B (en) Graphical user interface system, display processing device and input processing device
CN103455969B (en) Image processing method and device
CN111541907B (en) Article display method, apparatus, device and storage medium
WO2016173427A1 (en) Method, device and computer readable medium for creating motion blur effect
CN110335200B (en) Virtual reality anti-distortion method, device and related equipment
CN103558971A (en) Browsing method, browsing device and terminal device
US10257500B2 (en) Stereoscopic 3D webpage overlay
CN105388611A (en) Wearable-type equipment and system for extending display of intelligent equipment
CN117321537A (en) Dynamic power configuration of eye-worn devices
US10592013B2 (en) Systems and methods for unifying two-dimensional and three-dimensional interfaces
CN108476316A (en) A 3D display method and user terminal
CN113010066B (en) Display parameter determination method and device
US10701347B2 (en) Identifying replacement 3D images for 2D images via ranking criteria
CN113130607A (en) Display device, electronic apparatus, control method, device, and readable storage medium
HK40046488A (en) Display method and related apparatus
CN116055627B (en) Screen-off control method, electronic equipment and storage medium
CN111930236A (en) Device control method, device, storage medium and electronic device
US20250029216A1 (en) Method, apparatus, electronic devices, and storage medium of image processing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40046488

Country of ref document: HK

GR01 Patent grant