CN109741464B - Method and apparatus for displaying real scenes - Google Patents
Method and apparatus for displaying real scenes
- Publication number
- CN109741464B (application CN201910015161.0A)
- Authority
- CN
- China
- Prior art keywords
- real scene
- display
- scene
- video stream
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
Embodiments of the present disclosure disclose methods and apparatuses for displaying a real scene. One embodiment of the method comprises: acquiring a video stream of a real scene; acquiring position information of a user; and displaying, based on the position information, the video stream of the real scene on a display in a manner merged with the indoor scene. This embodiment fuses the display of the real scene with the indoor scene, achieving the effect of being placed in the real scene.
Description
Technical Field
Embodiments of the present disclosure relate to the field of display technology, and in particular to a method and apparatus for displaying a real scene.
Background
French windows have been widely adopted since the end of the last century and are used in many buildings for their attractive appearance and good lighting.
However, not all rooms are designed with French windows that let occupants enjoy the outdoor landscape, and some rooms (e.g., basements or interior rooms) have no windows at all. Compared with rooms that have French windows, these rooms are typically dimly lit and feel oppressive.
Moreover, even where a French window is provided, a poor view outside it can still feel oppressive.
Disclosure of Invention
Embodiments of the present disclosure propose methods and apparatuses for displaying a real scene.
In a first aspect, an embodiment of the present disclosure provides a method for displaying a real scene, the method including: acquiring a video stream of a real scene; acquiring position information of a user; and displaying, based on the position information, the video stream of the real scene on a display in a manner merged with the indoor scene.
In some embodiments, displaying the video stream of the real scene on the display in a manner merged with the indoor scene based on the position information comprises: determining, based on the position information, the viewing angle from which the user views the real scene; performing viewing-angle conversion on the video stream of the real scene so that the converted video stream is consistent with the real scene as observed from that viewing angle; and displaying the converted video stream overlaid with a preset display template on the display, wherein the real scene is displayed in the area enclosed by the preset display template.
In some embodiments, the preset display template comprises at least one of a window frame template, a curtain template, and a television wall template.
In some embodiments, the method further comprises: determining the light intensity and light angle of the real scene from the video stream of the real scene; and emitting light into the indoor scene at that light intensity and light angle.
In some embodiments, the method further comprises: changing the light intensity and light angle over time; and emitting light into the indoor scene at the changed light intensity and light angle.
In some embodiments, the method further comprises: in response to detecting a change in the user's position, performing a viewing-angle adjustment and/or a depth-of-field adjustment on the real scene displayed on the display based on the changed position information.
In some embodiments, the method further comprises: in response to detecting that the user is facing the display, rendering the real scene displayed on the display so that it is presented in augmented reality form.
In some embodiments, the display area of the display includes a first area displaying the real scene and a second area displaying a window frame, the second area surrounding the first area. The method further comprises: detecting the indoor light intensity; and rendering the second area based on the indoor light intensity.
In some embodiments, the method further comprises: in response to detecting a limb action indicating opening/closing a window, adjusting the display of the window frame so that the window frame is in a state corresponding to the limb action.
In some embodiments, the method further comprises: acquiring an audio stream of the real scene; determining the volume and sound effect corresponding to the position information; and controlling a stereo speaker in the room to play the audio stream of the real scene at that volume and sound effect.
In some embodiments, the method further comprises: in response to detecting a change in the user's position, adjusting the volume and sound effect of the stereo speaker based on the changed position information.
In some embodiments, a virtual window is displayed in the display. The method further comprises: in response to detecting a change in the open/close state of the virtual window, adjusting the volume and sound effect of the stereo speaker to correspond to the changed open/close state.
In a second aspect, an embodiment of the present disclosure provides an apparatus for displaying a real scene, the apparatus including: a video acquisition unit configured to acquire a video stream of a real scene; a position acquisition unit configured to acquire position information of a user; and a display unit configured to display the video stream of the real scene on a display, based on the position information, in a manner merged with the indoor scene.
In some embodiments, the display unit includes: a viewing-angle determination module configured to determine, based on the position information, the viewing angle from which the user views the real scene; a video conversion module configured to perform viewing-angle conversion on the video stream of the real scene so that the converted video stream is consistent with the real scene as observed from that viewing angle; and an overlay display module configured to display the converted video stream overlaid with a preset display template on the display, wherein the real scene is displayed in the area enclosed by the preset display template.
In some embodiments, the preset display template comprises at least one of a window frame template, a curtain template, and a television wall template.
In some embodiments, the apparatus further comprises: a live-scene light determination unit configured to determine the light intensity and light angle of the real scene from the video stream of the real scene; and a light emitting unit configured to emit light into the indoor scene at that light intensity and light angle.
In some embodiments, the apparatus further comprises: a light changing unit configured to change the light intensity and light angle over time, wherein the light emitting unit is further configured to emit light into the indoor scene at the changed light intensity and light angle.
In some embodiments, the apparatus further comprises: a viewing angle/depth adjustment unit configured to perform viewing angle adjustment and/or depth adjustment on a real scene displayed in the display based on the changed position information in response to detecting the position change of the user.
In some embodiments, the apparatus further comprises: an augmented reality unit configured to render a real scene displayed in the display in response to detecting that the user is oriented toward the display, the real scene being displayed in an augmented reality form.
In some embodiments, the display area of the display includes a first area displaying the real scene and a second area displaying the window frame, the second area surrounding the first area. The device also includes: an indoor light detecting unit configured to detect an indoor light intensity; a rendering unit configured to render the second area based on the indoor light intensity.
In some embodiments, the apparatus further comprises: a window adjustment unit configured to adjust the display of the window frame in response to detecting a limb action indicating opening/closing a window, so that the window frame is in a state corresponding to the limb action.
In some embodiments, the apparatus further comprises: an audio acquisition unit configured to acquire an audio stream of the real scene; a sound effect determination unit configured to determine the volume and sound effect corresponding to the position information; and an audio output unit configured to control a stereo speaker in the room to play the audio stream of the real scene at that volume and sound effect.
In some embodiments, the apparatus further comprises: an audio effect adjusting unit configured to adjust a volume and audio effects of the stereo speaker based on the changed position information in response to detecting the position change of the user.
In some embodiments, a virtual window is displayed in the display. The sound effect adjusting unit is further configured to adjust the volume and the effect of the stereo speaker to correspond to the changed on-off state of the virtual window in response to detecting the change in the on-off state of the virtual window.
In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors; a storage device having one or more programs stored thereon; when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the method as described in any implementation of the first aspect.
In a fourth aspect, the present application provides a computer readable medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the method as described in any implementation manner of the first aspect.
According to the method and apparatus for displaying a real scene provided by the embodiments of the present disclosure, a video stream of a real scene and the position information of a user are acquired, and the video stream is then displayed on a display, based on the position information, in a manner merged with the indoor scene. The display of the real scene is thus fused with the indoor scene, achieving the effect of being placed in the real scene.
Drawings
Other features, objects and advantages of the disclosure will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which one embodiment according to the present disclosure may be applied;
FIG. 2 is a flow diagram of one embodiment of a method for displaying a real world scene according to the present disclosure;
FIG. 3 is a schematic illustration of one application scenario of a method for displaying a real scene according to the present disclosure;
FIG. 4 is a schematic illustration of another application scenario of the method for displaying a real scene according to the present disclosure;
FIG. 5 is a schematic block diagram of an apparatus for displaying a real scene according to the present disclosure;
FIG. 6 is a schematic structural diagram of an electronic device suitable for use in implementing embodiments of the present disclosure.
Detailed Description
The present disclosure is described in further detail below with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments of the present disclosure may be combined with each other without conflict. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 illustrates an exemplary system architecture 100 to which embodiments of the disclosed method for displaying a real world scene or apparatus for displaying a real world scene may be applied.
As shown in fig. 1, system architecture 100 may include an image capture device 101, a video display device 102, an audio output device 103, a network 104, and a host 105. Network 104 is used to provide a medium for communication links between image capture device 101, video display device 102, audio output device 103, and host 105. The network 104 may include various connection types, such as wired, wireless communication links, and so forth.
The image capture device 101 may be an electronic device that supports capturing images of an indoor user, including but not limited to a high definition camera, a binocular vision system, and the like.
The video display device 102 may be an electronic device having a high-resolution display screen (e.g., 8K, 16K, etc.), including but not limited to an LCD (Liquid Crystal Display) device, an OLED (Organic Light-Emitting Diode) display device, a PDP (Plasma Display Panel) device, an FED (Field Emission Display) device, and the like. The video display device may be mounted on an indoor wall so as to serve as part or all of that wall.
The audio output device 103 may be an electronic device that supports sound output, including but not limited to a surround sound enclosure or the like.
The host 105 may interact with the image capture device 101, the video display device 102, and the audio output device 103. For example, the host 105 may control the image capture device 101 to capture an image of the user to determine location information of the user. The host 105 may also adjust the video stream based on the user's location information and send the adjusted video stream to the video display device 102 for display. The host 105 may also send the audio stream to the audio output device 103 for playback.
It should be noted that the method for displaying the real scene provided by the embodiment of the disclosure is generally performed by the host 105, and accordingly, the apparatus for displaying the real scene is generally disposed in the host 105.
It should be understood that the number of image capture devices, video display devices, audio output devices, networks, and hosts in fig. 1 is merely illustrative. There may be any number of image capture devices, video display devices, audio output devices, networks, and hosts, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for displaying a real scene in accordance with the present disclosure is shown. The method for displaying a real scene may comprise the steps of:
Step 201: acquire a video stream of a real scene.
In this embodiment, the executing body of the method for displaying a real scene (e.g., host 105 of fig. 1) may obtain a video stream of the real scene locally or remotely (e.g., from the cloud). The video stream may be pre-stored locally or updated in real time from the cloud. The real scene may include a real outdoor environment, including but not limited to a mountain-top scene, a beach scene, the scene outside the window of one's home, and the like. The video stream of the real scene may be captured by a 3D or 360-degree high-definition camera installed in the real scene.
Step 202: acquire position information of the user.
In this embodiment, the executing body of the method for displaying a real scene (e.g., host 105 of fig. 1) may acquire the position information of a user in the room. The position information may include the user's distance and angle relative to the display.
In some optional implementations of this embodiment, the executing body may acquire the user's position information through a specific image capture device. For example, a binocular vision system may be installed in the room; the executing body may acquire a user image captured by the binocular vision system and determine the user's position from the acquired image and the system's calibration parameters.
Although this embodiment describes acquiring the user's position information using the binocular vision principle, the present application is not limited thereto. Those skilled in the art will appreciate that the position information may also be obtained in other suitable ways, for example, by acquiring a user image containing a structured-light pattern.
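As a concrete illustration of the triangulation this involves, the following Python sketch recovers a user's 3D position from one matched point in a calibrated, rectified stereo pair. It is a minimal sketch, not the disclosure's implementation; the calibration values (focal_px, baseline_m, cx, cy) and the pixel coordinates are illustrative assumptions.

```python
import numpy as np

def user_position_from_stereo(xl, xr, y, focal_px, baseline_m, cx, cy):
    """Triangulate a point matched in a rectified stereo pair (a sketch)."""
    disparity = xl - xr                      # horizontal disparity (pixels)
    if disparity <= 0:
        raise ValueError("point must lie in front of both cameras")
    z = focal_px * baseline_m / disparity    # distance from the camera plane (m)
    x = (xl - cx) * z / focal_px             # lateral offset (m)
    y_m = (y - cy) * z / focal_px            # vertical offset (m)
    return np.array([x, y_m, z])

# Example: a face matched at x=640 px (left image) and x=610 px (right image).
pos = user_position_from_stereo(640.0, 610.0, 360.0,
                                focal_px=1000.0, baseline_m=0.12,
                                cx=640.0, cy=360.0)
distance = float(np.linalg.norm(pos))                      # user-to-display distance
angle_deg = float(np.degrees(np.arctan2(pos[0], pos[2])))  # angle off the display normal
```

The resulting distance and angle are exactly the two quantities the position information is said to include.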
Step 203: display the video stream of the real scene on the display, based on the position information, in a manner merged with the indoor scene.
In this embodiment, the executing body (e.g., host 105 of fig. 1) may adjust the video stream of the real scene based on the position information acquired in step 202, and then display it on the display in a manner merged with the indoor scene (i.e., the environment in the room). Merging with the indoor scene means that, from the user's perspective, the displayed video appears to be part of the real environment, rather than a video being played on a display or television. The display may be mounted on an indoor wall so as to serve as part of the wall (e.g., a display embedded in the wall to simulate a real window) or all of it (e.g., a full "display wall" tiled from displays). To create an immersive experience, the display should be large and of high resolution. For example, a display with a resolution of 8K (a digital video standard, 7680x4320) or higher may be used.
Compared with enjoying scenery through a real window, the scheme of this embodiment lets users enjoy outdoor scenes without a window, achieving an effect similar to a sea-view room, a mountain-view room, or a room with French windows; and since the real scene can be freely selected, there is no problem of a poor view. In addition, because no window (such as a French window) is needed, the privacy risks that a real window brings are avoided, meeting the everyday need to protect one's privacy.
In some optional implementations of this embodiment, step 203 may specifically include the following steps:
In a first step, the viewing angle from which the user views the real scene is determined based on the position information. For example, it may be determined from the user's angle relative to the display. Here, the viewing angle may refer to the angle between the line connecting the user to the center of the display and the display's normal direction.
In a second step, viewing-angle conversion is performed on the video stream of the real scene so that the converted video stream is consistent with the real scene as observed from that viewing angle (i.e., when displayed, the picture matches what the user would see in the real scene from the viewing angle determined in the first step).
In a third step, the converted video stream is displayed on the display overlaid with a preset display template, the real scene being displayed in the display area enclosed by the template. The preset display template provides the transition between the indoor scene and the real scene (achieving the effect of blending with the indoor scene); a minimal sketch of these two steps appears after the following paragraph.
Optionally, the preset display template may include at least one of a window frame template, a curtain template, and a television wall template. For example, the video stream of the real scene may be displayed overlaid with a window frame template, so that from the user's perspective it looks like scenery viewed through a real window.
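The second and third steps above might look as follows in Python with OpenCV. This is a minimal sketch under simplifying assumptions: a horizontal shear stands in for a full projective re-rendering of the scene, and the window-frame template is assumed to be an RGBA image the same size as the display, transparent in its enclosed area.

```python
import cv2
import numpy as np

def render_window_view(scene_frame, template_rgba, user_pos):
    """Warp one scene frame toward the user's viewing angle, then
    composite a window-frame template over it (a sketch)."""
    h, w = template_rgba.shape[:2]
    x, _, z = user_pos
    angle = np.arctan2(x, z)                  # viewing angle vs. display normal
    # A horizontal shear approximates the view-dependent appearance;
    # this stands in for the patented conversion, not reproduces it.
    shear = np.float32([[1.0, np.tan(angle) * 0.1, 0.0],
                        [0.0, 1.0, 0.0]])
    warped = cv2.warpAffine(scene_frame, shear, (w, h))
    # The scene shows through wherever the template is transparent,
    # i.e. in the area "enclosed" by the window frame.
    alpha = template_rgba[:, :, 3:4].astype(np.float32) / 255.0
    out = alpha * template_rgba[:, :, :3] + (1.0 - alpha) * warped
    return out.astype(np.uint8)
```

The alpha-composite in the last lines is what makes the frame template read as foreground while the live scene fills the enclosed area behind it.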
In some optional implementations of this embodiment, the method for displaying a real scene may further include a step of simulating the light of the real scene indoors. It specifically comprises the following two steps:
first, the light intensity and light angle of the light in the video stream acquired in step 201 are determined.
Then, light is emitted toward the indoor scene at the determined light intensity and light angle.
As an example, a light projection device may be installed on an indoor wall, and the execution body may control the light projection device to emit light into the room at the determined light intensity and light angle, so that the indoor light may look like sunlight projected through a window.
Although this implementation describes determining the intensity and angle of the projected light into the room from the video stream of the real scene, the application is not so limited. Those skilled in the art will appreciate that other suitable means for determining the intensity and angle of the projected light rays may be used. For example, the light intensity and the light angle of the real scene may be determined by a light intensity/angle detection device in the real scene, and then the light may be emitted indoors at the determined light intensity and light angle.
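As one illustration of the video-stream-based approach, the following sketch estimates an overall light intensity from mean luminance and a coarse light direction from the centroid of the brightest pixels. This is an assumed heuristic for the sketch, not the method prescribed by the disclosure.

```python
import cv2
import numpy as np

def estimate_scene_light(frame_bgr):
    """Estimate light intensity and a coarse light direction from one frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    intensity = float(gray.mean() / 255.0)        # overall brightness, 0..1
    # Take the centroid of the brightest 1% of pixels as the light source
    # (e.g. the sun in an outdoor shot) and map it to angles.
    ys, xs = np.nonzero(gray >= np.percentile(gray, 99))
    h, w = gray.shape
    azimuth = (xs.mean() / w - 0.5) * 90.0        # degrees, left(-)/right(+)
    elevation = (0.5 - ys.mean() / h) * 90.0      # degrees above the horizon
    return intensity, azimuth, elevation
```

The returned intensity and angles could then drive the light projection device described above.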
In some optional implementations of this embodiment, the method for displaying a real scene may further include the steps of:
first, the light intensity and light angle are changed based on the temporal change. For example, the light intensity and the light angle may be changed according to a preset correspondence relationship between time and light intensity/angle (such as a time-intensity/time-angle expression set empirically, or a light intensity and a light angle of a real scene detected in real time).
Light is then projected toward the indoor scene at the altered light intensity and light angle.
As an example, consider a room in a nursing home from which no outdoor scene is visible. A display (serving as the room's "window") and the light projection device of this embodiment are embedded in the wall, and the display shows the scenery outside the home where the elderly resident used to live. As time passes, the light and shadow projected into the room gradually change, as if the resident were watching time flow by from inside that home. By presenting changing light and shadow indoors, this implementation conveys the passage of time.
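A minimal sketch of the empirical time-to-light mapping mentioned above follows; the sinusoidal daylight model (light between roughly 6:00 and 18:00, peaking at noon) is an assumption for illustration.

```python
import math
from datetime import datetime

def light_schedule(now=None):
    """Map wall-clock time to a projected light intensity and elevation."""
    now = now or datetime.now()
    hour = now.hour + now.minute / 60.0
    phase = (hour - 6.0) / 12.0          # 0 at ~6:00, 1 at ~18:00
    if not 0.0 <= phase <= 1.0:
        return 0.0, 0.0                  # night: no projected light
    intensity = math.sin(math.pi * phase)         # 0..1, peaks at noon
    elevation = 90.0 * math.sin(math.pi * phase)  # sun elevation (degrees)
    return intensity, elevation
```

A real deployment could equally drive this from the live scene's detected light, as the previous paragraph notes.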
In some optional implementations of this embodiment, the method may further include: detecting whether the user's position in the room changes, and if a change is detected (e.g., the user moves from position A to position B), performing a viewing-angle adjustment and/or depth-of-field adjustment on the video stream of the real scene based on the changed position information. As an example, when the user moves toward the display, the distance between them shrinks and the range of the real scene shown on the display widens accordingly, much as more of the scenery outside a window becomes visible as one approaches it. This strengthens the fusion of the real scene with the indoor scene and improves the sense of being placed in the real scene. A sketch of this distance-to-field-of-view relationship follows.
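The sketch below computes a centered crop of the scene that widens as the user approaches the display; the reference distance and visible fractions are illustrative assumptions.

```python
def visible_crop(scene_w, scene_h, distance_m,
                 base_distance_m=2.0, base_fraction=0.6):
    """Return the centered crop of the scene visible "through" the window:
    the closer the user stands, the wider the crop, mimicking how more of
    the outdoors becomes visible as one approaches a real window."""
    frac = min(base_fraction * base_distance_m / max(distance_m, 0.3), 1.0)
    return int(scene_w * frac), int(scene_h * frac)

# At 2 m the user sees 60% of the scene; at 1 m, all of it; at 4 m, 30%.
print(visible_crop(7680, 4320, 1.0))   # -> (7680, 4320)
print(visible_crop(7680, 4320, 4.0))   # -> (2304, 1296)
```

The crop would then be rescaled to the display, giving the widening field of view described above.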
It should be appreciated that when two or more users are present in the room, the display's response to position changes may be suppressed. That is, even if a change in a user's position is detected, the displayed picture is not adjusted, so as to avoid display errors.
In some optional implementations of this embodiment, the method may further include: in response to detecting that the user is facing the display, rendering the real scene displayed on the display so that it is presented in augmented reality form. As an example, if the real scene is a sea view (such as the view from a sea-view room), then when the user faces the display, a dolphin may be shown leaping out of the sea in augmented reality, increasing the user's enjoyment of the scene and keeping it from becoming monotonous.
In some optional implementations of this embodiment, the display area of the display may include a first area displaying the real scene and a second area displaying a window frame, the second area surrounding the first area. Correspondingly, the method may further include: detecting the indoor light intensity, and rendering the second area based on the detected indoor light intensity. As an example, in the real scene, sunlight shines through a window into a room. In this example, the executing body may render the second area based on the indoor light intensity together with the real scene's light intensity and angle, so that the window frame displayed in the second area appears lit by that sunlight. In this way, the fusion of the real scene and the indoor scene becomes more natural and realistic.
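One way such rendering could be approximated is sketched below: the frame region gets a base brightness from the room light, plus a one-sided boost toward the scene's light source. The gain constants and the 500-lux normalization are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def shade_window_frame(frame_rgb, indoor_lux, scene_intensity, scene_azimuth_deg):
    """Shade the window-frame region from the room light plus the scene's
    'sunlight' so the frame looks lit consistently with both (a sketch)."""
    shaded = frame_rgb.astype(np.float32)
    shaded *= 0.4 + 0.4 * min(indoor_lux / 500.0, 1.0)   # room-light base level
    # Brighten the side of the frame facing the scene's light source.
    w = frame_rgb.shape[1]
    ramp = np.linspace(-1.0, 1.0, w)                     # left .. right
    side = 1.0 if scene_azimuth_deg >= 0 else -1.0
    boost = 1.0 + 0.3 * scene_intensity * np.clip(side * ramp, 0.0, 1.0)
    shaded *= boost[np.newaxis, :, np.newaxis]           # broadcast over rows/RGB
    return np.clip(shaded, 0.0, 255.0).astype(np.uint8)
```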
Optionally, corresponding to the implementation manner, the method for displaying the real scene may further include: in response to detecting a physical movement (e.g., a gesture) to open and close the window, the window frame displayed in the second area is adjusted such that the adjusted window frame is in a state corresponding to the physical movement. As an example, the user makes a gesture to open a window, and after the execution subject detects the gesture, the angle and position of the window frame displayed in the second area are adjusted as if the window is actually opened by the user.
In some optional implementations of this embodiment, the method may further include: first, acquiring an audio stream of the real scene (for example, through a spatial sound-field recording device); then determining the volume and sound effect corresponding to the user's position; and finally controlling the indoor stereo speakers to play the audio stream of the real scene at the determined volume and sound effect. This further enhances the experience of being placed in the real scene.
Optionally, the method may further include: in response to detecting a change in the user's position, adjusting the volume and sound effect of the stereo speakers based on the changed position information, so that the change in what the user hears matches the change one would hear when walking through the real scene.
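A minimal sketch of such position-tracked playback follows, using inverse-distance attenuation per speaker as a stand-in for a real spatial-audio renderer; the speaker layout and clamping constants are assumptions.

```python
import numpy as np

def speaker_gains(user_pos, speaker_positions, ref_volume=0.8):
    """Per-speaker gains that track the listener with inverse-distance
    attenuation, so the sound field shifts as the user walks around."""
    user = np.asarray(user_pos, dtype=np.float32)
    dists = [np.linalg.norm(user - np.asarray(s, dtype=np.float32))
             for s in speaker_positions]
    g = np.array([ref_volume / max(d, 0.5) for d in dists])  # clamp near-field
    return np.clip(g / g.max(), 0.0, 1.0)                    # normalize to 0..1

# Example: four corner speakers in a 5 m x 4 m room, listener near the front.
gains = speaker_gains([1.0, 0.0, 1.5],
                      [[0, 0, 0], [5, 0, 0], [0, 0, 4], [5, 0, 4]])
```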
It will be appreciated that when two or more users are present in the room, the response of the volume and/or sound effects to position changes may likewise be suppressed. That is, even if a change in a user's position is detected, the stereo speakers are not adjusted, so as to avoid confusing the sound.
Optionally, a virtual window is displayed in the display. The method may further include: in response to detecting a change in the open/close state of the virtual window (e.g., detecting a window-opening gesture and opening the virtual window), adjusting the volume and sound effect of the stereo speakers to correspond to the changed state. As an example, if the virtual window changes from closed to open, the volume and sound effect are adjusted so that the sound of the real scene becomes louder and clearer, like the sound heard when a real window is opened.
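A sketch of how the open/close state might gate playback is below; the attenuation factor and low-pass cutoff are assumptions chosen to suggest the muffled sound of a closed window.

```python
import numpy as np

def window_state_audio(gains, window_open):
    """Adjust playback for the virtual window's open/close state:
    closed -> quieter and duller, open -> full level (factors assumed)."""
    level = 1.0 if window_open else 0.3          # closed window attenuates sound
    lowpass_hz = 16000 if window_open else 2000  # closed window muffles highs
    return np.asarray(gains) * level, lowpass_hz
```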
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the method of this embodiment. In the scenario of fig. 3, two walls of a room are display walls 301 tiled from 8K displays; two cameras 303 (forming a binocular camera system) are mounted on one display wall 301, and surround-sound speakers 304 are placed in the room. First, the host acquires a video stream of a locally stored sea scene 305. It then obtains the position information of the user 302 by capturing images of the user 302 with the cameras 303, so as to determine the viewing angle from which the user 302 views the sea scenery 305. Finally, the video stream of the sea scene 305 is adjusted in viewing angle and depth of field, overlaid with the window frame template 306, and displayed on the two display walls, which is equivalent to viewing the sea scenery 305 through two French windows, just as the user 302 would in a sea-view room. While the video stream is acquired, the audio stream of the sea scene 305 is also acquired; the volume and sound effect of the surround speakers 304 are determined from the position of the user 302, and the audio stream is played at that volume and sound effect, so that the user 302 hears waves and seabirds as if in a room by the sea.
With continued reference to fig. 4, fig. 4 is a schematic diagram of another application scenario of the method of this embodiment. In the scenario of fig. 4, one wall of a room is a display wall tiled from 8K displays with a light projection device mounted on it; the user wants to view the city scenery 404 seen from a high-rise building, and a camera 401 installed on such a building captures the city scenery 404 in real time and uploads it to the cloud 402. First, the host obtains the video stream of the live city scenery 404 from the cloud 402 and determines the intensity and angle of the light in the video stream. It then obtains the position information of a user (not shown) in the room to determine the viewing angle from which the user views the city scenery 404. Finally, the video stream of the city scenery 404 is adjusted in viewing angle and depth of field, overlaid with the floor-to-ceiling window frame template 403 and the curtain template 406, and shown on the display wall, while the light projection device is controlled to cast light into the room at the determined intensity and angle so that light and shadow 406 appear on the floor. The user thus views the city scenery 404 as if through the floor-to-ceiling window of a high-rise building, as if actually standing in one.
According to the method provided by the embodiments of the present disclosure, a video stream of a real scene and the position information of a user are acquired, and the video stream is then displayed on a display, based on the position information, in a manner merged with the indoor scene. The display of the real scene is thus fused with the indoor scene, achieving the effect of being placed in the real scene.
With further reference to fig. 5, as an implementation of the methods shown in the above figures, the present disclosure provides an embodiment of an apparatus for displaying a real scene, which corresponds to the method embodiment shown in fig. 2, and which may be applied in various electronic devices.
As shown in fig. 5, the apparatus 500 for displaying a real scene of this embodiment may include a video acquisition unit 501, a position acquisition unit 502, and a display unit 503. The video acquisition unit 501 may be configured to acquire a video stream of a real scene. The position acquisition unit 502 may be configured to acquire position information of a user. The display unit 503 may be configured to display the video stream of the real scene on a display, based on the position information, in a manner merged with the indoor scene.
In this embodiment, the video acquisition unit 501 of the apparatus 500 may acquire a video stream of a real scene locally or remotely (e.g., from the cloud). The video stream may be pre-stored locally or updated in real time from the cloud. The real scene may include a real outdoor environment, including but not limited to a mountain-top scene, a beach scene, the scene outside the window of one's home, and the like. The video stream may be captured by a 3D or 360-degree high-definition camera installed in the real scene.
In this embodiment, the position acquisition unit 502 may acquire the position information of a user in the room. The position information may include the user's distance and angle relative to the display.
In this embodiment, the display unit 503 may adjust the video stream of the real scene based on the position information acquired by the position acquisition unit 502 and then display it on the display in a manner merged with the indoor scene (i.e., the environment in the room). Merging with the indoor scene means that, from the user's perspective, the displayed video appears to be part of the real environment, rather than a video being played on a display or television. The display may be mounted on an indoor wall so as to serve as part of the wall (e.g., a display embedded in the wall to simulate a real window) or all of it (e.g., a full "display wall" tiled from displays). To create an immersive experience, the display should be large and of high resolution; for example, a display with a resolution of 8K (a digital video standard, 7680x4320) or higher may be used.
Compared with enjoying scenery through a real window, the scheme of this embodiment lets users enjoy outdoor scenes without a window, achieving an effect similar to a sea-view room, a mountain-view room, or a room with French windows; and since the real scene can be freely selected, there is no problem of a poor view. In addition, because no window (such as a French window) is needed, the privacy risks that a real window brings are avoided, meeting the everyday need to protect one's privacy.
In some optional implementations of this embodiment, the display unit 503 may include a viewing-angle determination module, a video conversion module, and an overlay display module. The viewing-angle determination module may be configured to determine, based on the position information, the viewing angle from which the user views the real scene. The video conversion module may be configured to perform viewing-angle conversion on the video stream of the real scene so that the converted video stream is consistent with the real scene as observed from that viewing angle. The overlay display module may be configured to display the converted video stream overlaid with a preset display template on the display, wherein the real scene is displayed in the area enclosed by the preset display template.
Optionally, the preset display template may include at least one of a window frame template, a curtain template, and a tv wall template.
In some optional implementations of this embodiment, the apparatus 500 may further include a live-scene light determination unit and a light emitting unit. The live-scene light determination unit may be configured to determine the light intensity and light angle of the real scene from the video stream of the real scene. The light emitting unit may be configured to emit light into the indoor scene at that light intensity and light angle.
In some optional implementations of this embodiment, the apparatus 500 may further include a light changing unit configured to change the light intensity and light angle over time. Correspondingly, the light emitting unit may be further configured to emit light into the indoor scene at the changed light intensity and light angle.
In some optional implementations of this embodiment, the apparatus 500 may further include a viewing angle/depth adjustment unit. Wherein the viewing angle/depth adjustment unit may be configured to perform viewing angle adjustment and/or depth adjustment on the real scene displayed in the display based on the changed position information in response to detecting the position change of the user.
In some optional implementations of this embodiment, the apparatus 500 may further include an augmented reality unit. Wherein the augmented reality unit may be configured to render a real scene displayed in the display in response to detecting that the user is oriented towards the display, the real scene being displayed in augmented reality form.
In some alternative implementations of the present embodiment, the display area of the display may include a first area displaying a real scene and a second area displaying a window frame. Wherein the second region may surround the first region. Corresponding to this implementation, the apparatus 500 may further include an indoor light detection unit and a rendering unit. Wherein the indoor light detecting unit may be configured to detect an indoor light intensity. The rendering unit may be configured to render the second area based on the indoor light intensity.
In some optional implementations of this embodiment, the apparatus 500 may further include a window adjustment unit. Wherein the window adjusting unit may be configured to adjust the display of the window frame in response to detecting a limb action indicative of opening/closing the window, such that the window frame is in a state corresponding to the limb action.
In some optional implementations of this embodiment, the apparatus 500 may further include an audio acquisition unit, a sound effect determination unit, and an audio output unit. The audio acquisition unit may be configured to acquire an audio stream of the real scene. The sound effect determination unit may be configured to determine the volume and sound effect corresponding to the position information. The audio output unit may be configured to control a stereo speaker in the room to play the audio stream of the real scene at that volume and sound effect.
In some optional implementations of this embodiment, the apparatus 500 may further include a sound effect adjusting unit. Wherein the sound effect adjusting unit may be configured to adjust the volume and sound effects of the stereo speaker based on the changed position information in response to detecting the position change of the user.
In some alternative implementations of the present embodiment, a virtual window is displayed in the display. Corresponding to this implementation, the sound effect adjusting unit may be further configured to adjust the volume and the effect of the stereo speaker to correspond to the changed on-off state of the virtual window in response to detecting the change in the on-off state of the virtual window.
According to the apparatus provided by the embodiments of the present disclosure, a video stream of a real scene and the position information of a user are acquired, and the video stream is then displayed on a display, based on the position information, in a manner merged with the indoor scene. The display of the real scene is thus fused with the indoor scene, achieving the effect of being placed in the real scene.
Referring now to FIG. 6, a schematic diagram of an electronic device (e.g., host computer in FIG. 1) 600 suitable for use in implementing embodiments of the present disclosure is shown. The host shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 6, electronic device 600 may include a processing device (e.g., central processing unit, graphics processor, etc.) 601 that may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 602 or a program loaded from a storage device 608 into a random access memory (RAM) 603. The RAM 603 also stores various programs and data necessary for the operation of the electronic device 600. The processing device 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
The following devices may be connected to the I/O interface 605 in general: an input device 606 including, for example, a keyboard, a mouse, etc.; output devices 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 608 including, for example, tape, hard disk, etc.; and a communication device 609. The communication means 609 may allow the electronic device 600 to communicate with other devices wirelessly or by wire to exchange data. While fig. 6 illustrates an electronic device 600 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 6 may represent one device or may represent multiple devices as desired.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 609, or may be installed from the storage means 608, or may be installed from the ROM 602. The computer program, when executed by the processing device 601, performs the above-described functions defined in the methods of the embodiments of the present disclosure. It should be noted that the computer readable medium in the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In embodiments of the present disclosure, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquiring a video stream of a real scene; acquiring position information of a user; based on the location information, a video stream of the real scene is displayed in the display in a manner merged with the indoor scene.
Computer program code for carrying out operations for embodiments of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes a video acquisition unit, a position acquisition unit, and a display unit. Where the names of these units do not in some cases constitute a limitation on the unit itself, for example, a video capture unit may also be described as a "unit that captures a video stream of a real scene".
The foregoing description is merely a description of preferred embodiments of the present disclosure and of the technical principles employed. Those skilled in the art will appreciate that the scope of the invention in the present disclosure is not limited to technical solutions formed by the specific combination of the above features, but also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the inventive concept, for example, technical solutions formed by interchanging the above features with (but not limited to) features having similar functions disclosed in this disclosure.
Claims (13)
1. A method for displaying a real scene, comprising:
acquiring a video stream of a real scene;
acquiring position information of a user;
determining a viewing angle from which the user views a real scene based on the position information;
performing viewing-angle conversion on the video stream of the real scene so that the converted video stream is consistent with the real scene as observed from the viewing angle;
displaying the converted video stream overlaid with a preset display template on the display, wherein a real scene is displayed in an area enclosed by the preset display template, and the preset display template comprises a window frame template;
detecting indoor light intensity, wherein a display area of the display comprises a first area displaying a real scene and a second area displaying a window frame, and the second area surrounds the first area;
and rendering the second area based on the indoor light intensity, the light intensity of the real scene and the light angle.
2. The method of claim 1, wherein the predetermined display templates further comprise at least one of a curtain template and a television wall template.
3. The method of claim 1, wherein the method further comprises:
determining the light intensity and the light angle of a real scene according to the video stream of the real scene;
and emitting light into the indoor scene at the light intensity and the light angle.
4. The method of claim 3, wherein the method further comprises:
changing the light intensity and the light angle based on a time variation;
emitting light to the indoor scene at the changed light intensity and the changed light angle.
5. The method of claim 1, wherein the method further comprises:
in response to detecting the change in position of the user, performing a perspective adjustment and/or a depth adjustment of the real scene displayed in the display based on the changed position information.
6. The method of claim 1, wherein the method further comprises:
in response to detecting that the user is facing the display, rendering a real scene displayed in the display, the real scene being displayed in augmented reality.
7. The method of claim 1, wherein the method further comprises:
and adjusting the display of the window frame in response to detecting a limb action indicating opening/closing a window, so that the window frame is in a state corresponding to the limb action.
8. The method according to any one of claims 1-7, wherein the method further comprises:
acquiring an audio stream of a real scene;
determining the volume and sound effect corresponding to the position information;
and controlling an indoor stereo speaker to play the audio stream of the real scene at the volume and the sound effect.
9. The method of claim 8, wherein the method further comprises:
in response to detecting the change in the position of the user, adjusting the volume and sound effects of the stereo speaker based on the changed position information.
10. The method of claim 8, wherein a virtual window is displayed in the display; and
the method further comprises the following steps:
in response to detecting a change in the on-off state of the virtual window, adjusting the volume and the sound effect of the stereo speaker to correspond to the changed on-off state of the virtual window.
11. An apparatus for displaying a real scene, comprising:
a video acquisition unit configured to acquire a video stream of a real scene;
a position acquisition unit configured to acquire position information of a user;
a display unit configured to determine, based on the position information, a viewing angle from which the user views a real scene; perform viewing-angle conversion on the video stream of the real scene so that the converted video stream is consistent with the real scene as observed from the viewing angle; and display the converted video stream overlaid with a preset display template on the display, wherein a real scene is displayed in an area enclosed by the preset display template, and the preset display template comprises a window frame template;
an indoor light detecting unit configured to detect an indoor light intensity, a display area of the display including a first area displaying a real scene and a second area displaying a window frame, the second area surrounding the first area;
a rendering unit configured to render the second area based on the indoor light intensity, light intensity of a real scene, and light angle.
12. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon,
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method recited in any of claims 1-10.
13. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-10.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910015161.0A CN109741464B (en) | 2019-01-08 | 2019-01-08 | Method and apparatus for displaying real scenes |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910015161.0A CN109741464B (en) | 2019-01-08 | 2019-01-08 | Method and apparatus for displaying real scenes |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109741464A CN109741464A (en) | 2019-05-10 |
CN109741464B true CN109741464B (en) | 2023-03-24 |
Family
ID=66363872
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910015161.0A Active CN109741464B (en) | 2019-01-08 | 2019-01-08 | Method and apparatus for displaying real scenes |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109741464B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110188482B (en) * | 2019-05-31 | 2022-06-21 | 魔门塔(苏州)科技有限公司 | Test scene creating method and device based on intelligent driving |
CN110751616B (en) * | 2019-10-16 | 2022-02-18 | 睿宇时空科技(重庆)有限公司 | Indoor and outdoor panoramic house-watching video fusion method |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103813115A (en) * | 2014-01-27 | 2014-05-21 | 吴砺 | Simulated window |
CN106791385A (en) * | 2016-12-09 | 2017-05-31 | 深圳创维-Rgb电子有限公司 | A kind of view method, apparatus and system based on virtual reality technology |
CN106997618A (en) * | 2017-04-14 | 2017-08-01 | 陈柳华 | A kind of method that virtual reality is merged with real scene |
- 2019-01-08: Application CN201910015161.0A filed in China; granted as CN109741464B (Active)
Also Published As
Publication number | Publication date |
---|---|
CN109741464A (en) | 2019-05-10 |
Similar Documents
Publication | Title |
---|---|
JP4594308B2 (en) | Visual content signal display device and method for displaying visual content signal |
US7180529B2 | Immersive image viewing system and method |
TWI530157B | Method and system for displaying multi-view images and non-transitory computer readable storage medium thereof |
US20090100767A1 | Audio-visual system |
KR20130010424A | Contents play method and apparatus |
JP2007159135A | A system for displaying video content and surround visual fields, a method for animating surround visual fields related to video displayed on a device, and a surround visual field controller for controlling the projection and movement of surround visual fields |
WO2014155670A1 | Stereoscopic video processing device, stereoscopic video processing method, and stereoscopic video processing program |
US20220129062A1 | Projection Method, Medium and System for Immersive Contents |
CN110928416A | Immersive Scenario Interactive Experience Simulation System |
CN109741464B (en) | Method and apparatus for displaying real scenes |
WO2021095573A1 | Information processing system, information processing method, and program |
US20090153550A1 | Virtual object rendering system and method |
CN118338237A | Method and apparatus for providing audio content in immersive reality |
CN110730340A | Lens transformation-based virtual auditorium display method, system and storage medium |
KR20210056414A | System for controlling audio-enabled connected devices in mixed reality environments |
WO2016157996A1 | Information processing device, information processing method, program, and image display system |
US20230186552A1 | System and method for virtualized environment |
JP2008193605A | Irradiation state control method |
KR102165026B1 | 360 degree augmented reality stereoscopic image experience system |
Reinhuber et al. | The Scale of Immersion: Different audio-visual experiences exemplified by the 360 video Secret Detours |
KR20220168858A | Space virtualization system using virtual environment information and time information |
CN218103295U | Performance control system for composite theater |
KR20220090251A | Method and system for providing fusion of concert and exhibition based concexhibition service for art gallery |
CN113577795A | Stage visual space construction method |
WO2021161894A1 | Information processing system, information processing method, and program |
Legal Events
Code | Title |
---|---|
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
GR01 | Patent grant |