Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides an image projection method, an image projection apparatus, a mobile terminal, a projection device, and a non-transitory computer-readable storage medium.
According to a first aspect of embodiments of the present disclosure, an image screen projection method is provided and applied to a mobile terminal. The method includes: sending a screen projection instruction and an image to a screen projection device, so that the screen projection device displays the image; receiving an image display instruction; changing a display mode of the image in response to the image display instruction; and sending a corresponding display instruction to the screen projection device based on the image display instruction, so that the screen projection device changes the display mode of the image correspondingly.
In an embodiment, the image display instruction includes an image operation instruction, and changing the display mode of the image in response to the image display instruction includes transforming the display of the image, in response to the image operation instruction, by at least one of rotating the image, scaling the image, moving the image, and video editing.
In an embodiment, sending the corresponding display instruction to the screen projection device based on the image display instruction includes: if the transformation includes rotating the image, the display instruction causes the screen projection device to rotate the displayed image by the same rotation angle; if the transformation includes scaling the image, the display instruction causes the screen projection device to scale the displayed image at the same scaling ratio; and if the transformation includes moving the image, the display instruction causes the screen projection device to move the displayed image according to the movement distance of the image on the mobile terminal, based on the display size ratio of the screen projection device to the mobile terminal.
In an embodiment, the image display instruction includes a slide show instruction, and changing the display mode of the image in response to the image display instruction includes: obtaining an image set corresponding to the image, the image set including a plurality of images to be displayed; and displaying the images to be displayed in the image set randomly or sequentially, each for a preset time.
In an embodiment, changing the display mode of the image in response to the image display instruction further includes determining a designated point of the currently displayed image, and enlarging the currently displayed image with the designated point as a base point.
In an embodiment, determining the designated point of the currently displayed image includes: performing target recognition on the currently displayed image; if a target is present in the currently displayed image, taking a center point or a key point of the target as the designated point; and if no target is present, determining the designated point according to a preset point or a random point.
In an embodiment, sending the corresponding display instruction to the screen projection device based on the image display instruction includes sending the currently displayed image in the image set to the screen projection device in real time, so that the screen projection device synchronously displays the currently displayed image.
In an embodiment, sending the corresponding display instruction to the screen projection device based on the image display instruction includes sending the image set and playing information to the screen projection device, so that the screen projection device displays the images to be displayed in the image set in synchronization with the mobile terminal, wherein the playing information includes at least the preset time and the playing order.
In an embodiment, the method further includes: if the current display content is interface information, sending the interface information to the screen projection device so that the screen projection device displays the entire content of the interface information; and if the current display content is the details of the image, performing the step of sending the screen projection instruction and the image to the screen projection device so that the screen projection device displays the image.
According to a second aspect of embodiments of the present disclosure, an image screen projection method is provided and applied to a screen projection device. The method includes: receiving a screen projection instruction and an image sent by a mobile terminal; displaying the image in response to the screen projection instruction; receiving a display instruction sent by the mobile terminal, the display instruction being generated based on an image display instruction received by the mobile terminal; and changing a display mode of the image in response to the display instruction, so that the display mode corresponds to the display mode changed by the mobile terminal based on the image display instruction.
In an embodiment, displaying the image in response to the screen projection instruction includes enlarging and displaying the image based on the display size ratio of the mobile terminal to the screen projection device.
In an embodiment, the display instruction includes an instruction for transforming the display of the image by at least one of rotating the image, scaling the image, and moving the image, and changing the display mode of the image in response to the display instruction includes: if the transformation includes rotating the image, rotating the displayed image by the same rotation angle; if the transformation includes scaling the image, scaling the displayed image at the same scaling ratio; and if the transformation includes moving the image, moving the displayed image according to the movement distance of the image on the mobile terminal, based on the display size ratio of the mobile terminal to the screen projection device.
In an embodiment, changing the display mode of the image in response to the display instruction includes receiving, in real time, the currently displayed image sent by the mobile terminal, and synchronously displaying the currently displayed image.
In an embodiment, changing the display mode of the image in response to the display instruction includes: receiving an image set and playing information sent by the mobile terminal, the image set including a plurality of images to be displayed and the playing information including at least a preset time and a playing order; and displaying the images to be displayed in the image set according to the playing order, each for the preset time.
According to a third aspect of embodiments of the present disclosure, an image screen projection apparatus is provided and applied to a mobile terminal. The apparatus includes a sending unit, a first receiving unit, and a first processing unit, wherein the sending unit is configured to send a screen projection instruction and an image to a screen projection device so that the screen projection device displays the image; the first receiving unit is configured to receive an image display instruction; the first processing unit is configured to change a display mode of the image in response to the image display instruction; and the sending unit is further configured to send a corresponding display instruction to the screen projection device based on the image display instruction, so that the screen projection device changes the display mode of the image correspondingly.
In an embodiment, the image display instruction includes an image operation instruction, and the first processing unit is further configured to transform the display of the image, in response to the image operation instruction, by at least one of rotating the image, scaling the image, moving the image, and video editing.
In an embodiment, the sending unit is configured to: if the transformation includes rotating the image, cause the screen projection device to rotate the displayed image by the same rotation angle; if the transformation includes scaling the image, cause the screen projection device to scale the displayed image at the same scaling ratio; and if the transformation includes moving the image, cause the screen projection device to move the displayed image according to the movement distance of the image on the mobile terminal, based on the display size ratio of the screen projection device to the mobile terminal.
In an embodiment, the image display instruction includes a slide show instruction, and the first processing unit is further configured to obtain an image set corresponding to the image, the image set including a plurality of images to be displayed, and to display the images to be displayed in the image set randomly or sequentially, each for a preset time.
In an embodiment, the first processing unit is further configured to determine a designated point of the currently displayed image, and enlarge the currently displayed image with the designated point as a base point.
In an embodiment, the first processing unit is further configured to perform target recognition on the currently displayed image, take a center point or a key point of the target as the designated point if a target is present in the currently displayed image, and determine the designated point according to a preset point or a random point if no target is present.
In an embodiment, the sending unit is further configured to send, in real time, the currently displayed image in the image set to the screen-projection device, and enable the screen-projection device to synchronously display the currently displayed image.
In an embodiment, the sending unit is further configured to send the image set and the playing information to the screen projection device, so that the screen projection device displays the images to be displayed in the image set in synchronization with the mobile terminal, where the playing information includes at least the preset time and the playing order.
In an embodiment, when the current display content is interface information, the sending unit sends the interface information to the screen projection device so that the screen projection device displays the entire content of the interface information; and when the current display content is the details of the image, the sending unit sends a screen projection instruction and the image to the screen projection device so that the screen projection device displays the image.
According to a fourth aspect of embodiments of the present disclosure, an image screen projection apparatus is provided and applied to a screen projection device. The apparatus includes a second receiving unit and a second processing unit, wherein the second receiving unit is configured to receive a screen projection instruction and an image sent by a mobile terminal; the second processing unit is configured to display the image in response to the screen projection instruction; the second receiving unit is further configured to receive a display instruction sent by the mobile terminal, the display instruction being generated based on an image display instruction received by the mobile terminal; and the second processing unit is further configured to change a display mode of the image in response to the display instruction, so that the display mode corresponds to the display mode changed by the mobile terminal based on the image display instruction.
In an embodiment, the second processing unit is further configured to enlarge and display the image based on the display size ratio of the mobile terminal to the screen projection device.
In an embodiment, the display instruction includes an instruction for transforming the display of the image by at least one of rotating the image, scaling the image, and moving the image, and the second processing unit is further configured to rotate the displayed image by the same rotation angle if the transformation includes rotating the image, scale the displayed image at the same scaling ratio if the transformation includes scaling the image, and move the displayed image according to the movement distance of the image on the mobile terminal, based on the display size ratio of the mobile terminal to the screen projection device, if the transformation includes moving the image.
In an embodiment, the second receiving unit is further configured to receive, in real time, a currently displayed image sent by the mobile terminal, and the second processing unit is further configured to synchronously display the currently displayed image.
In an embodiment, the second receiving unit is further configured to receive an image set and playing information sent by the mobile terminal, where the image set includes a plurality of images to be displayed and the playing information includes at least a preset time and a playing order, and the second processing unit is further configured to display the images to be displayed in the image set according to the playing order, each for the preset time.
According to a fifth aspect of embodiments of the present disclosure, a mobile terminal is provided, including a processor and a memory for storing instructions executable by the processor, wherein the processor is configured to perform the image projection method described in the foregoing first aspect.
According to a sixth aspect of embodiments of the present disclosure, a screen projection device is provided, including a processor and a memory for storing instructions executable by the processor, wherein the processor is configured to perform the image projection method described in the foregoing second aspect.
According to a seventh aspect of embodiments of the present disclosure, a non-transitory computer-readable storage medium is provided, storing instructions that, when executed by a processor of a mobile terminal, implement the image projection method described in the foregoing first aspect.
According to an eighth aspect of embodiments of the present disclosure, a non-transitory computer-readable storage medium is provided, storing instructions that, when executed by a processor of a screen projection device, implement the image projection method described in the foregoing second aspect.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects: during screen projection, the display mode of the image on the mobile terminal can be changed and the change is reflected synchronously on the screen projection device, which makes browsing convenient, enables display and sharing in various forms, meets different user requirements, and improves user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations set forth in the following exemplary embodiments do not represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatuses and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
To solve the above problems, the present disclosure provides an image projection method 10. Fig. 1 is a schematic flowchart of the image projection method 10 according to an exemplary embodiment. The image projection method 10 may be applied to a mobile terminal such as a mobile phone or a tablet computer, and includes steps S11 to S14, which are described in detail below.
In step S11, a screen projection instruction and an image are sent to the screen projection device, so that the screen projection device displays the image.
In the embodiments of the present disclosure, the screen projection device may be a device capable of displaying images at a large size, such as a television, a display, or a projector, and may establish a near-field connection with the mobile terminal, for example through Wi-Fi or Bluetooth. After the connection is established, the mobile terminal can send the image to be projected to the screen projection device, and the screen projection device displays the image based on the screen projection instruction.
Since the display size of the mobile terminal differs from that of the screen projection device, an image displayed full screen on the mobile terminal needs to be enlarged when displayed full screen on the screen projection device. In some embodiments, the screen projection device may enlarge the image based on the display size ratio of the mobile terminal to the screen projection device. For example, if the display screen of the mobile terminal is 15 cm in the longitudinal direction and 8 cm in the transverse direction, and the display of the screen projection device is 90 cm in the longitudinal direction and 160 cm in the transverse direction, the smaller of the size ratios in the two directions is taken as the magnification factor, which ensures that the image fills the screen in one direction without overflowing the other. Alternatively, after acquiring the image, the screen projection device may display it full screen according to the relationship between the size of the image and its own display size.
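As a concrete illustration of the scaling rule just described, the minimal sketch below computes the magnification factor as the minimum of the two per-direction size ratios; the class and function names are illustrative assumptions, not part of the disclosure.

```kotlin
// Minimal sketch: the image is enlarged by the smaller of the two per-direction
// ratios between the projection device's display size and the mobile terminal's
// display size, so it fills the screen in one direction without overflowing the other.
// All names here are illustrative assumptions.
data class DisplaySize(val widthCm: Double, val heightCm: Double)

fun magnificationFactor(mobile: DisplaySize, projector: DisplaySize): Double {
    val widthRatio = projector.widthCm / mobile.widthCm    // transverse ratio
    val heightRatio = projector.heightCm / mobile.heightCm // longitudinal ratio
    return minOf(widthRatio, heightRatio)
}

fun main() {
    // Example from the text: mobile screen 8 cm x 15 cm, projection display 160 cm x 90 cm.
    val mobile = DisplaySize(widthCm = 8.0, heightCm = 15.0)
    val projector = DisplaySize(widthCm = 160.0, heightCm = 90.0)
    println(magnificationFactor(mobile, projector)) // 6.0: the 15 cm side exactly fills 90 cm
}
```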
In step S12, an image display instruction is received.
In the embodiments of the present disclosure, the user can change the display mode of the images displayed on the mobile terminal and the screen projection device as needed. The user can operate the mobile terminal so that it receives an image display instruction, for example by tapping to trigger a preset instruction. Corresponding image display instructions may also be generated through different gestures on the touch screen of the mobile terminal: two fingers kept pressed and moved toward or away from each other shrink or enlarge the image, a one-finger swipe moves the image, and two fingers kept pressed and rotated around each other rotate the image. Many control modes are possible, and they are not limited here.
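As a hedged sketch of how such gestures might map to image display instructions, the snippet below uses a simplified gesture model; the gesture and instruction types are assumptions for illustration only and do not reflect any particular touch framework.

```kotlin
// Hypothetical mapping from touch gestures to image display instructions.
// A real implementation would use the platform's touch/gesture framework instead.
sealed class ImageDisplayInstruction {
    data class Scale(val factor: Double) : ImageDisplayInstruction()          // pinch in/out
    data class Move(val dxPx: Int, val dyPx: Int) : ImageDisplayInstruction() // one-finger swipe
    data class Rotate(val degrees: Double) : ImageDisplayInstruction()        // two fingers circling
}

sealed class Gesture {
    data class Pinch(val spreadRatio: Double) : Gesture()
    data class Swipe(val dxPx: Int, val dyPx: Int) : Gesture()
    data class TwoFingerRotate(val degrees: Double) : Gesture()
}

fun interpret(gesture: Gesture): ImageDisplayInstruction = when (gesture) {
    is Gesture.Pinch -> ImageDisplayInstruction.Scale(gesture.spreadRatio)
    is Gesture.Swipe -> ImageDisplayInstruction.Move(gesture.dxPx, gesture.dyPx)
    is Gesture.TwoFingerRotate -> ImageDisplayInstruction.Rotate(gesture.degrees)
}
```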
In step S13, the display mode of the image is changed in response to the image display instruction.
After receiving the image display instruction, the mobile terminal changes the display mode of the image according to the instruction. Changes to the image display may include zooming, rotating, moving, displaying multiple images in succession, adding special effects, and so on.
In step S14, based on the image display instruction, a corresponding display instruction is sent to the screen projection device, so that the screen projection device transforms the display mode of the image correspondingly.
In the embodiments of the present disclosure, after the user transforms the image displayed on the mobile terminal, the mobile terminal sends the corresponding display instruction to the screen projection device, so that the screen projection device applies the same transformation to the displayed image and the two images remain synchronized. In this way, by adjusting the image displayed on the mobile terminal, the user achieves a consistent, synchronized adjustment of the image displayed by the screen projection device, which enriches the screen projection function and meets the needs of more users.
In some embodiments, after the mobile terminal is connected to the screen projection device, the screen projection mode can be switched automatically or manually according to the current application scenario of the mobile terminal. If the current display content of the mobile terminal is interface information, such as the phone desktop or an album browsing interface, the current interface information can be sent to the screen projection device so that the screen projection device displays the entire content of the interface information; that is, mirror projection is used and the screen projection device fully reproduces the content displayed by the mobile terminal. This allows the user to complete the corresponding operations while watching the display area of the screen projection device, without having to look at the screen of the mobile terminal, avoiding the inconvenience of repeatedly shifting the gaze. If the current display content is the details of an image, for example a single image selected from the album and displayed on its own, step S11 may be performed so that the screen projection device displays the image. The projection mode used in step S11 may be a DLNA (Digital Living Network Alliance) mode, and the screen projection instruction sent in step S11 may switch or extend the protocol between the mobile terminal and the screen projection device, thereby implementing another projection mode that can display images in more varied ways, present richer effects, and satisfy more user needs.
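The mode switch described above might be organized roughly as follows; this is a sketch under the assumption that the terminal can distinguish "interface information" from "image details", and the content model and casting hooks are hypothetical names, not an API defined by the disclosure.

```kotlin
// Illustrative sketch of the mode switch: mirror the whole interface when the
// terminal shows interface information, and fall back to single-image projection
// (step S11) when it shows the details of one image.
sealed class CurrentContent {
    data class InterfaceInfo(val screenFrame: ByteArray) : CurrentContent()
    data class ImageDetails(val imageBytes: ByteArray) : CurrentContent()
}

interface Caster {
    fun mirror(frame: ByteArray)            // mirror-style projection of the whole interface
    fun projectImage(imageBytes: ByteArray) // DLNA-style single-image projection (step S11)
}

fun chooseProjectionMode(content: CurrentContent, caster: Caster) = when (content) {
    is CurrentContent.InterfaceInfo -> caster.mirror(content.screenFrame)
    is CurrentContent.ImageDetails -> caster.projectImage(content.imageBytes)
}
```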
In an embodiment, the image display instruction includes an image operation instruction, and step S13 may include transforming the display of the image, in response to the image operation instruction, by at least one of rotating the image, scaling the image, moving the image, and video editing. In this embodiment, the user operates the currently displayed image on the mobile terminal and changes its display mode, and the screen projection device changes the display mode correspondingly and synchronously, so as to remain consistent with the transformed image displayed by the mobile terminal and meet different user requirements. The images described in the embodiments of the present disclosure are not limited to still picture formats such as jpg, jpeg, and png; they also include dynamic formats such as gif, and may further include video formats such as mp4 and rmvb. The user may change the image display mode in one of the above ways or in a combination of several of them, for example enlarging the image and then moving it and rotating it by a certain angle. Video editing may include editing operations on an image in a video format, and may also include adjusting the playing mode of the video, such as changing the playing speed or fast-forwarding and rewinding.
Based on the above embodiments, in some embodiments the screen projection device may perform the corresponding transformation according to the transformation mode. Step S14 may include: if the transformation includes rotating the image, the display instruction causes the screen projection device to rotate the displayed image by the same rotation angle; if the transformation includes scaling the image, the display instruction causes the screen projection device to scale the displayed image at the same scaling ratio; and if the transformation includes moving the image, the display instruction causes the screen projection device to move the displayed image according to the movement distance of the image on the mobile terminal, based on the display size ratio of the screen projection device to the mobile terminal. In this embodiment, when the image is rotated, the rotation angle can be acquired in real time and the rotation can be applied in real time on the screen projection device based on that angle, so that the transformation remains synchronized. When the image is scaled, the scaling ratio can be acquired in real time, so that the currently displayed image is scaled at the same ratio on the screen projection device. When the image is moved, the movement is based on the display size ratio of the screen projection device to the mobile terminal, which depends on the ratio between the magnification at which the original image fills the display screen of the mobile terminal and the magnification at which it fills the screen projection device. For example, if the original image must be enlarged 2 times to fill the mobile terminal and 3 times to fill the screen projection device, the ratio is 2:3; in that case, a movement of 2 pixels on the mobile terminal corresponds to a movement of 3 pixels on the screen projection device.
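The movement mapping in the 2:3 example above can be written as a one-line calculation; the sketch below assumes, as the text suggests, that the display size ratio is the ratio between the fill magnifications of the two screens, and the function name is illustrative.

```kotlin
// Sketch of the movement mapping: a pixel distance on the mobile terminal is
// rescaled by the ratio between the projection device's fill magnification and
// the mobile terminal's fill magnification.
fun mapMovement(movedPxOnMobile: Int, mobileFillScale: Double, projectorFillScale: Double): Int {
    val ratio = projectorFillScale / mobileFillScale
    return Math.round(movedPxOnMobile * ratio).toInt()
}

fun main() {
    // Example from the text: 2x on the terminal, 3x on the projection device -> ratio 2:3.
    println(mapMovement(movedPxOnMobile = 2, mobileFillScale = 2.0, projectorFillScale = 3.0)) // 3
}
```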
Through the above embodiments, the transformation performed by the user on the currently displayed image is synchronized between the mobile terminal and the screen projection device, which makes it convenient to display and share the image and meets different display requirements of the user.
In other embodiments, the manner in which multiple images are displayed on the mobile terminal may be set or changed. As shown in Fig. 2, the image display instruction includes a slide show instruction, and step S13 may include step S131 of obtaining an image set corresponding to the image, where the image set includes a plurality of images to be displayed, and step S132 of displaying the images to be displayed in the image set randomly or sequentially, each for a preset time. In this embodiment, the mobile terminal may hold an image set composed of a plurality of images, for example all photos in an album, or a folder of images designated by the user. The user can display the images of the image set in turn as a slide show. The slide show instruction may be generated when the user taps the corresponding command key. The preset time is the time for which a single image is displayed; it may be 5 seconds, or it may be set according to a user instruction. The playing order can be generated randomly, or determined according to attributes of the images, such as their acquisition order or size order, and the images are displayed in that order. After the user performs the corresponding operation on the mobile terminal to start the slide show, the screen projection device also displays the slide show synchronously, in step with the images displayed on the mobile terminal. In some embodiments, various transition effects may be used when switching between images, such as the current image fading out and the next image fading in, or the current image flying out and the next image flying in. Adding transition effects during the slide show makes the displayed content more vivid and enjoyable to watch.
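A minimal sketch of this slide show behaviour follows: images are shown randomly or in order, each for the preset time, and each newly shown image is also pushed to the projection device so the two displays stay in step. The two callbacks are placeholders, not functions defined by the disclosure.

```kotlin
// Minimal slide show sketch: shuffle or keep the order, dwell for the preset
// time per image, and mirror each shown image to the projection device.
fun runSlideShow(
    imageSet: List<String>,             // e.g. file paths of the images to be displayed
    presetMillis: Long,                 // preset display time per image, e.g. 5000
    randomOrder: Boolean,
    showLocally: (String) -> Unit,      // assumed hook: display on the mobile terminal
    sendToProjector: (String) -> Unit   // assumed hook: push to the screen projection device
) {
    val order = if (randomOrder) imageSet.shuffled() else imageSet
    for (image in order) {
        showLocally(image)
        sendToProjector(image)
        Thread.sleep(presetMillis)      // hold the current image for the preset time
    }
}
```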
In an embodiment, step S14 may include sending the currently displayed image in the image set to the screen projection device in real time, so that the screen projection device displays the currently displayed image synchronously. During slide playback, the mobile terminal can send the currently displayed image to the screen projection device in real time, so that the screen projection device displays it synchronously and thereby presents the images in the image set as a slide show. Several images can also be sent to the screen projection device in advance according to the playing order, to reduce the display delay between the mobile terminal and the screen projection device caused by real-time transmission.
In another embodiment, step S14 may include sending the image set and playing information to the screen projection device, so that the screen projection device displays the images to be displayed in the image set in synchronization with the mobile terminal, where the playing information includes at least a preset time and a playing order. In this embodiment, the image set to be played as a slide show may be sent to the screen projection device together with the preset time and the playing order, and the screen projection device then plays the images in the image set as a slide show according to the playing order, where the playing order may be generated randomly by the mobile terminal or derived from attributes of the images. Because the preset time and the playing order are the same as those used by the mobile terminal, the screen projection device and the mobile terminal play synchronously.
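The "playing information" payload of this variant might look like the hedged sketch below; the field names and structure are assumptions for illustration only, and no transport format is implied by the disclosure.

```kotlin
// Hypothetical payload for the "send image set plus playing information" variant:
// the projection device receives everything it needs to run the same slide show
// locally, so no per-image message is required during playback.
data class PlayingInfo(
    val presetMillis: Long,      // preset display time per image
    val playOrder: List<Int>     // indices into the image set, random or attribute-based
)

data class SlideShowPayload(
    val images: List<ByteArray>, // the images to be displayed
    val playingInfo: PlayingInfo
)
```

Because both ends hold the same order and dwell time, the two slide shows can stay synchronized without continuous real-time transmission.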
Display effects can be added to the displayed image during the slide show. In some embodiments, step S13 may further include determining a designated point of the currently displayed image, and enlarging the currently displayed image with the designated point as a base point. In this example, effects are added to the currently displayed image during the slide show, making it more expressive. The designated point may be a preset point, such as any of the four corners of the image, or a randomly determined point. While the currently displayed image is enlarged with the designated point as the base point, the designated point can be moved slowly toward the center of the display area, so that the image moves as it is gradually enlarged.
In other embodiments, the designated point of the currently displayed image may be determined by performing target recognition on the currently displayed image, taking a center point or a key point of the target as the designated point if a target is present in the currently displayed image, and determining the designated point according to a preset point or a random point if no target is present. In this embodiment, the designated point is determined from the specific content of the currently displayed image, so that the added slide effect better highlights that content. A target recognition model can determine whether a specific target, such as a person or a mountain peak, is present in the currently displayed image. If a target is recognized, its center point may be taken as the designated point, or any key point of the target may be used; for example, if a person is recognized in the image, the center pixel of the person, or a key point such as the tip of the nose, may be taken as the designated point. In this way the slide show has a stronger visual effect. In addition, to keep the screen projection device synchronized with the mobile terminal, the mobile terminal can send the determined designated point to the screen projection device, which performs the same enlargement based on that point, so that the two displays stay synchronized.
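The fallback logic for choosing the designated point can be sketched as follows; the recognition model is left abstract (the `detectTarget` hook is a stand-in, not a specified component), and the point and target types are illustrative.

```kotlin
// Illustrative sketch of the designated-point rule: if target recognition finds a
// target, its centre (or a key point) becomes the base point for enlargement;
// otherwise a preset corner point or a random point is used.
data class Point(val x: Int, val y: Int)
data class Target(val center: Point, val keyPoints: List<Point> = emptyList())

fun designatedPoint(
    imageWidth: Int,
    imageHeight: Int,
    detectTarget: () -> Target?,   // assumed hook: target recognition on the current image
    presetPoint: Point? = null     // e.g. one of the four corner points, if configured
): Point {
    val target = detectTarget()
    return when {
        target != null -> target.keyPoints.firstOrNull() ?: target.center
        presetPoint != null -> presetPoint
        else -> Point((0 until imageWidth).random(), (0 until imageHeight).random())
    }
}
```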
Based on the same inventive concept, the embodiments of the present disclosure further provide a corresponding image projection method 20, as shown in Fig. 3. The image projection method 20 can be applied to any of the above screen projection devices, so that the image is displayed synchronously with the mobile terminal and the changes the user makes to the display mode on the mobile terminal are reproduced synchronously, meeting the user's requirements for the screen projection device. The image projection method 20 includes: step S21, receiving a screen projection instruction and an image sent by a mobile terminal; step S22, displaying the image in response to the screen projection instruction; step S23, receiving a display instruction sent by the mobile terminal, the display instruction being generated based on the image display instruction received by the mobile terminal; and step S24, changing the display mode of the image in response to the display instruction, so that the display mode corresponds to the display mode changed by the mobile terminal based on the image display instruction.
In an embodiment, step S22 may include enlarging and displaying the image based on the display size ratio of the mobile terminal to the screen projection device.
In an embodiment, the display instruction includes an instruction for transforming the display of the image by at least one of rotating the image, scaling the image, and moving the image, and step S24 may include: rotating the displayed image by the same rotation angle if the transformation includes rotating the image; scaling the displayed image at the same scaling ratio if the transformation includes scaling the image; and moving the displayed image according to the movement distance of the image on the mobile terminal, based on the display size ratio of the mobile terminal to the screen projection device, if the transformation includes moving the image.
In one embodiment, step S24 may include receiving, in real time, the currently displayed image sent by the mobile terminal, and synchronously displaying the currently displayed image.
In an embodiment, step S24 may include receiving an image set and playing information sent by the mobile terminal, where the image set includes a plurality of images to be displayed and the playing information includes at least a preset time and a playing order, and displaying the images to be displayed in the image set according to the playing order, each for the preset time.
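On the projection-device side, step S24 might dispatch received display instructions as in the rough sketch below; the instruction model and renderer hooks are illustrative assumptions rather than interfaces defined by the disclosure.

```kotlin
// Rough sketch of step S24 on the projection device: apply each received display
// instruction so the projected image tracks the mobile terminal, rescaling pixel
// movements by the display size ratio.
sealed class DisplayInstruction {
    data class Rotate(val degrees: Double) : DisplayInstruction()
    data class Scale(val factor: Double) : DisplayInstruction()
    data class Move(val dxPxOnMobile: Int, val dyPxOnMobile: Int) : DisplayInstruction()
}

interface Renderer {
    fun rotateBy(degrees: Double)
    fun scaleBy(factor: Double)
    fun moveBy(dxPx: Int, dyPx: Int)
}

fun applyInstruction(instr: DisplayInstruction, renderer: Renderer, sizeRatio: Double) {
    when (instr) {
        is DisplayInstruction.Rotate -> renderer.rotateBy(instr.degrees)   // same rotation angle
        is DisplayInstruction.Scale -> renderer.scaleBy(instr.factor)      // same scaling ratio
        is DisplayInstruction.Move -> renderer.moveBy(                     // distance rescaled
            (instr.dxPxOnMobile * sizeRatio).toInt(),
            (instr.dyPxOnMobile * sizeRatio).toInt()
        )
    }
}
```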
The specific manner and technical effects of the steps of the image projection method 20 in the above embodiment have been described in the foregoing embodiment of the image projection method 10, and will not be described in detail herein.
Based on the same conception, the embodiments of the present disclosure further provide an image projection apparatus 100 and an image projection apparatus 200.
It will be appreciated that, in order to implement the above functions, the image projection apparatus provided in the embodiments of the present disclosure includes corresponding hardware structures and/or software modules for performing the respective functions. Combining the example units and algorithm steps described in the embodiments of the present disclosure, the embodiments of the present disclosure may be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the technical solution. Those skilled in the art may implement the described functionality using different approaches for each particular application, but such implementations should not be considered beyond the scope of the embodiments of the present disclosure.
Fig. 4 is a block diagram of an image projection apparatus 100 according to an exemplary embodiment. Referring to Fig. 4, the image projection apparatus 100 is applied to a mobile terminal and includes: a sending unit 110 configured to send a screen projection instruction and an image to a screen projection device so that the screen projection device displays the image; a first receiving unit 120 configured to receive an image display instruction; and a first processing unit 130 configured to change the display mode of the image in response to the image display instruction, wherein the sending unit 110 is further configured to send a corresponding display instruction to the screen projection device based on the image display instruction, so that the screen projection device changes the display mode of the image correspondingly.
In an embodiment, the image display instruction includes an image operation instruction, and the first processing unit 130 is further configured to transform the display of the image, in response to the image operation instruction, by at least one of rotating the image, scaling the image, moving the image, and video editing.
In an embodiment, the sending unit 110 is configured to: if the transformation includes rotating the image, cause the screen projection device to rotate the displayed image by the same rotation angle; if the transformation includes scaling the image, cause the screen projection device to scale the displayed image at the same scaling ratio; and if the transformation includes moving the image, cause the screen projection device to move the displayed image according to the movement distance of the image on the mobile terminal, based on the display size ratio of the screen projection device to the mobile terminal.
In an embodiment, the image display instruction includes a slide show instruction, and the first processing unit 130 is further configured to obtain an image set corresponding to the image, where the image set includes a plurality of images to be displayed, and to display the images to be displayed in the image set randomly or sequentially, each for a preset time.
In an embodiment, the first processing unit 130 is further configured to determine a designated point of the currently displayed image, and enlarge the currently displayed image with the designated point as a base point.
In an embodiment, the first processing unit 130 is further configured to perform target recognition on the currently displayed image, take a center point or a key point of the target as the designated point if a target is present in the currently displayed image, and determine the designated point according to a preset point or a random point if no target is present.
In an embodiment, the sending unit 110 is further configured to send, in real time, the currently displayed image in the image set to the screen projection device, so that the screen projection device displays the currently displayed image synchronously.
In an embodiment, the sending unit 110 is further configured to send the image set and the playing information to the screen projection device, so that the screen projection device displays the images to be displayed in the image set in synchronization with the mobile terminal, where the playing information includes at least a preset time and a playing order.
In an embodiment, when the current display content is interface information, the sending unit 110 sends the interface information to the screen projection device so that the screen projection device displays the entire content of the interface information; and when the current display content is the details of the image, the sending unit 110 sends the screen projection instruction and the image to the screen projection device so that the screen projection device displays the image.
With respect to the image projection apparatus 100 in the above-described embodiment, the specific manner in which the respective modules perform the operations has been described in detail in the embodiment regarding the method, and will not be described in detail here.
Fig. 5 is a block diagram of an image projection apparatus 200 according to an exemplary embodiment. Referring to Fig. 5, the image projection apparatus 200 is applied to a screen projection device and includes: a second receiving unit 210 configured to receive a screen projection instruction and an image sent by a mobile terminal; and a second processing unit 220 configured to display the image in response to the screen projection instruction, wherein the second receiving unit 210 is further configured to receive a display instruction sent by the mobile terminal, the display instruction being generated based on the image display instruction received by the mobile terminal, and the second processing unit 220 is further configured to change the display mode of the image in response to the display instruction, so that the display mode corresponds to the display mode changed by the mobile terminal based on the image display instruction.
In an embodiment, the second processing unit 220 is further configured to enlarge and display the image based on the display size ratio of the mobile terminal to the screen projection device.
In an embodiment, the display instruction includes an instruction for transforming the display of the image by at least one of rotating the image, scaling the image, and moving the image, and the second processing unit 220 is further configured to rotate the displayed image by the same rotation angle if the transformation includes rotating the image, scale the displayed image at the same scaling ratio if the transformation includes scaling the image, and move the displayed image according to the movement distance of the image on the mobile terminal, based on the display size ratio of the mobile terminal to the screen projection device, if the transformation includes moving the image.
In an embodiment, the second receiving unit 210 is further configured to receive the currently displayed image sent by the mobile terminal in real time, and the second processing unit 220 is further configured to synchronously display the currently displayed image.
In an embodiment, the second receiving unit 210 is further configured to receive an image set and playing information sent by the mobile terminal, where the image set includes a plurality of images to be displayed and the playing information includes at least a preset time and a playing order, and the second processing unit 220 is further configured to display the images to be displayed in the image set according to the playing order, each for the preset time.
With respect to the image projection apparatus 200 in the above-described embodiment, the specific manner in which the respective modules perform the operations has been described in detail in the embodiment regarding the method, and will not be described in detail herein.
Fig. 6 is a block diagram illustrating an apparatus for image projection according to an exemplary embodiment. For example, apparatus 800 may be a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, exercise device, personal digital assistant, or the like.
Referring to FIG. 6, the apparatus 800 may include one or more of a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls overall operation of the apparatus 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interactions between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the apparatus 800. Examples of such data include instructions for any application or method operating on the device 800, contact data, phonebook data, messages, pictures, videos, and the like. The memory 804 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power component 806 provides power to the various components of the device 800. The power components 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 800.
The multimedia component 808 includes a screen between the device 800 and the user that provides an output interface. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or slide action, but also the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. The front camera and/or the rear camera may receive external multimedia data when the apparatus 800 is in an operational mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 further includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to, a home button, a volume button, an activate button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing status assessment of various aspects of the apparatus 800. For example, the sensor assembly 814 may detect an on/off state of the device 800, a relative positioning of the components, such as a display and keypad of the device 800, the sensor assembly 814 may also detect a change in position of the device 800 or a component of the device 800, the presence or absence of user contact with the device 800, an orientation or acceleration/deceleration of the device 800, and a change in temperature of the device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the apparatus 800 and other devices. The apparatus 800 may access a wireless network based on a communication standard, such as Wi-Fi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 816 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra-Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements, for performing the methods described above.
In an exemplary embodiment, a non-transitory computer-readable storage medium is also provided, such as the memory 804 including instructions executable by the processor 820 of the apparatus 800 to perform the above-described method. For example, the non-transitory computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
Fig. 7 is a block diagram of an apparatus 1100 for image projection according to an exemplary embodiment. For example, the apparatus 1100 may be provided as a server. Referring to Fig. 7, the apparatus 1100 includes a processing component 1122, which further includes one or more processors, and memory resources represented by a memory 1132 for storing instructions, such as application programs, executable by the processing component 1122. The application programs stored in the memory 1132 may include one or more modules, each corresponding to a set of instructions. Further, the processing component 1122 is configured to execute the instructions to perform the image projection method described above.
The apparatus 1100 may also include a power component 1126 configured to perform power management of the apparatus 1100, a wired or wireless network interface 1150 configured to connect the apparatus 1100 to a network, and an input/output (I/O) interface 1158. The apparatus 1100 may operate based on an operating system stored in the memory 1132, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
It is to be understood that the term "plurality" in the present disclosure means two or more, and other quantifiers are to be understood similarly. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate three cases: A alone, both A and B, and B alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. The singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It is further understood that the terms "first," "second," and the like are used to describe various information, but such information should not be limited to these terms. These terms are only used to distinguish one type of information from another and do not denote a particular order or importance. Indeed, the expressions "first", "second", etc. may be used entirely interchangeably. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure.
It will be further understood that the terms "center," "longitudinal," "transverse," "front," "rear," "upper," "lower," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like, as used herein, refer to an orientation or positional relationship based on that shown in the drawings, merely for convenience in describing the present embodiments and simplifying the description, and do not indicate or imply that the devices or elements referred to must have a particular orientation, be constructed and operate in a particular orientation.
It will be further understood that "connected" includes both direct connection where no other member is present and indirect connection where other element is present, unless specifically stated otherwise.
It will be further understood that although operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following the general principles thereof and including such departures from the present disclosure as come within known or customary practice in the art. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.