CN104571512A - Method and device for assisting multiple projection ends in projection - Google Patents
- Publication number
- CN104571512A CN104571512A CN201410842843.6A CN201410842843A CN104571512A CN 104571512 A CN104571512 A CN 104571512A CN 201410842843 A CN201410842843 A CN 201410842843A CN 104571512 A CN104571512 A CN 104571512A
- Authority
- CN
- China
- Prior art keywords
- information
- scanning
- projection
- treating apparatus
- output
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1407—General aspects irrespective of display type, e.g. determination of decimal point position, display with fixed or driving decimal point, suppression of non-significant zeros
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
The invention provides a method and a device for assisting multiple projection ends in projection. The method includes the following steps: obtaining to-be-output information; determining, according to the to-be-output information, the part output information in it that corresponds to each of one or more of the projection ends; and transmitting the corresponding part output information to the one or more projection ends, so that each projection end projects based on its own part output information.
Description
Technical field
The present invention relates to the field of computer technology, and in particular to a method and apparatus for assisting multiple projection ends in projection.
Background technology
In the prior art, projection generally requires connecting a projection device to another device, so that the projection device can output images based on to-be-output information received from that device. For example, a conventional projector is connected to a laptop in order to project the laptop's screen content onto a curtain. Because multiple devices are needed, the occasions and ways in which a user can project are significantly limited.
Summary of the invention
The object of the present invention is to provide a method and apparatus for assisting multiple projection ends in projection.
According to one aspect of the present invention, a method for assisting multiple projection ends in projection is provided, wherein the multiple projection ends correspond to a processing device, and wherein the method comprises the following steps:
A. obtaining to-be-output information;
B. determining, according to the to-be-output information, the part output information in the to-be-output information that corresponds to each of one or more of the multiple projection ends;
C. transmitting the corresponding part output information to each of the one or more projection ends, so that each projection end projects based on its own part output information.
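The steps A to C above can be sketched as follows. This is an illustrative Python sketch only; the function names, the `project` call, and the even byte split standing in for a "predetermined division rule" are all assumptions rather than part of the disclosure.

```python
def assist_projection(to_be_output: bytes, projection_ends: list) -> None:
    """Steps A-C: take the obtained to-be-output information, divide it
    into one part per projection end, and transmit each part."""
    # Step B: an even byte split stands in for the "predetermined
    # division rule"; each slice is one part output information.
    n = len(projection_ends)
    size = -(-len(to_be_output) // n)  # ceiling division
    parts = [to_be_output[i * size:(i + 1) * size] for i in range(n)]
    # Step C: transmit each part to its projection end for projection.
    for end, part in zip(projection_ends, parts):
        end.project(part)
```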
According to another aspect of the present invention, a processing device for assisting multiple projection ends in projection is also provided, wherein the multiple projection ends correspond to the processing device, and wherein the processing device comprises:
an acquisition device for obtaining to-be-output information;
a determining device for determining, according to the to-be-output information, the part output information in the to-be-output information that corresponds to each of one or more of the multiple projection ends;
a transmitting device for transmitting the corresponding part output information to each of the one or more projection ends, so that each projection end projects based on its own part output information.
Compared with the prior art, the present invention has the following advantages: the to-be-output information for projection can be processed so that the multiple projection ends contained in a wearable device each project a part of it, together outputting a partial or complete projection of the to-be-output information and giving the user a better visual experience. Moreover, according to the solution of the present invention, scanning information can be transmitted between wearable devices, so that a user can project a virtual image of another user based on the scanning information received from that user; two users can thus interact with each other's virtual images, improving the user experience.
Accompanying drawing explanation
Other features, objects and advantages of the present invention will become more apparent by reading the following detailed description of non-limiting embodiments with reference to the accompanying drawings:
Fig. 1 is a flow chart of a method for assisting multiple projection ends in projection according to the present invention;
Fig. 2 is a structural diagram of a processing device for assisting multiple projection ends in projection according to the present invention.
In the drawings, the same or similar reference numerals denote the same or similar components.
Embodiment
The present invention is described in further detail below with reference to the accompanying drawings.
Fig. 1 is a flow chart of a method for assisting multiple projection ends in projection according to the present invention. The method according to the present invention comprises step S1, step S2 and step S3, and is implemented by a processing device contained in a wearable device for projection.
The wearable device includes, but is not limited to, any device that can be worn on a human body and comprises multiple projection ends, for example, clothes or ornamental accessories containing the processing device and its corresponding multiple projection ends.
The processing device includes an electronic device that can automatically perform numerical calculation and/or information processing according to preset or stored instructions; its hardware includes, but is not limited to, a microprocessor, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a digital signal processor (DSP), an embedded device, etc. Preferably, the processing device can perform image processing operations including, but not limited to, image segmentation, image compositing, 3D effects, etc.
The projection end includes any device capable of projection. Preferably, the projection end includes a holographic projector capable of mid-air projection, and in particular a miniature holographic projector that can be distributed over the surface of the wearable device.
Preferably, the processing device can be connected to each projection end contained in the wearable device, in order to transmit output information for projection to it.
Preferably, the wearable device also comprises multiple scanning ends. A scanning end includes a device that can obtain scan content information comprising visual information such as image information or video information, for example a miniature camera. More preferably, a scanning end also includes a device that can obtain scan position information corresponding to itself, for example a micro-sensor for obtaining three-dimensional coordinates.
Preferably, a wearable device according to the present invention can exchange data over a network with other wearable devices according to the present invention. The network in which the wearable device resides includes, but is not limited to, the Internet, a wide area network, a metropolitan area network, a local area network, a VPN, etc.
It should be noted that the above wearable devices, projection ends, scanning ends and networks are only examples; other existing or future wearable devices, projection ends, scanning ends and networks, if applicable to the present invention, should also be included within the scope of the present invention and are incorporated herein by reference.
Referring to Fig. 1, in step S1, the processing device obtains to-be-output information.
The to-be-output information includes, but is not limited to, either of the following:
1) view-related information: information from another device that can be used for projection, for example an image file, a video file or an interface screenshot from a smartphone.
2) scanning information: scan content information from one or more scanning ends corresponding to the processing device, for example the scan image information or scan video information that a scanning end obtains for itself. Preferably, the scanning information can also include scan position information corresponding to the scan content information, such as three-dimensional or two-dimensional coordinate information.
Then, in step S2, the processing device determines, according to the to-be-output information, the part output information in the to-be-output information that corresponds to each of one or more of the multiple projection ends.
The way in which the processing device does so includes, but is not limited to, either of the following:
1) When the to-be-output information comprises view-related information, the processing device first determines the one or more projection ends, among the multiple projection ends, that will project this to-be-output information; it then divides the to-be-output information into part output information corresponding to each of the one or more projection ends, based on a predetermined division rule.
Preferably, the multiple projection ends correspond respectively to multiple regions of the wearable device.
Specifically, the processing device can determine, according to the view-related information, the number of projection ends that will project this to-be-output information and the region in which each of them is located. Then, based on a predetermined division rule, the processing device divides the view-related information into multiple pieces of part output information and assigns each piece to one of the determined projection ends.
Preferably, the processing device can also determine the number of projection ends required based on attributes of the to-be-output information itself. For example, when the to-be-output information is image information, it can be classified by pixel height into "high-definition images" and "standard images", with a larger number of projection ends used to project a "high-definition image" and a smaller number used to project a "standard image".
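The resolution-based choice of projection-end count described above might look like the following sketch; the 1080-pixel threshold and the halved count for "standard images" are assumed values chosen for illustration, not figures from the disclosure.

```python
def projection_end_count(pixel_height: int, total_ends: int = 50,
                         hd_threshold: int = 1080) -> int:
    """Classify image information by pixel height and choose how many
    projection ends to use: all of them for a "high-definition image",
    fewer for a "standard image"."""
    if pixel_height >= hd_threshold:
        return total_ends          # high-definition image: more ends
    return total_ends // 2         # standard image: fewer ends
```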
According to a first example of the invention, the wearable device is a smart vest comprising a processing device and 50 miniature holographic projectors evenly distributed over the front and sides of the vest, and the vest is connected over a network to the smartphone of user A. When the processing device in the vest obtains a picture image_1 from the smartphone in step S1, it judges from the pixel information of image_1 that image_1 is a high-definition picture, and further determines that all 50 miniature holographic projectors of the vest will be used to project image_1. Then, based on a predetermined division rule, the processing device performs an image segmentation operation to divide image_1 into 50 pieces of part output information by area, and assigns these pieces, in order from left to right and top to bottom, to the 50 miniature holographic projectors of the vest.
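The area-based division of image_1 in this example can be sketched as a simple grid split. The 10x5 grid and the image dimensions are assumptions, chosen so that the 50 tiles, listed left to right and top to bottom, map one-to-one onto projectors p_1 to p_50.

```python
def split_by_area(width: int, height: int, cols: int = 10, rows: int = 5):
    """Divide an image into cols * rows equal-area tiles, listed left to
    right and top to bottom, so that tile i is the part output
    information of projector p_(i + 1)."""
    tile_w, tile_h = width // cols, height // rows
    tiles = []
    for r in range(rows):          # top to bottom
        for c in range(cols):      # left to right
            tiles.append((c * tile_w, r * tile_h, tile_w, tile_h))
    return tiles

# 50 tiles of image_1 for the vest's 50 miniature holographic projectors
tiles = split_by_area(1000, 500)
```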
2) When the to-be-output information comprises one or more pieces of scanning information, step S2 comprises step S201 (not shown) and step S202 (not shown).
In step S201, the processing device selects, from the one or more pieces of scanning information, at least one piece of scanning information for projection. Specifically, the processing device can make this selection based on a user's choice, or based on a preset content-selection mechanism.
Preferably, the scan content information contains scanning-end identification information, based on which the processing device can determine the scanning end that the scan content information corresponds to. More preferably, the scanning-end identification information comprises the number of each scanning end corresponding to the processing device.
Preferably, the processing device first determines a scan projection mode and then, based on that mode, selects from the one or more pieces of scanning information at least one piece for projection. The scan projection mode comprises any mode that can be set for projection; for example, when the scanning information corresponds to multiple parts of a user's body, the scan projection modes can include front, back and side modes, so that the projection ends project the scanning information corresponding to the selected body part.
Then, in step S202, the processing device selects, from the multiple projection ends, at least one projection end corresponding to each piece of the selected scanning information, and uses each such piece as the part output information of its corresponding projection end or ends.
According to a second example of the present invention, user A wears a wearable device Dev_1 in the form of clothes. Dev_1 comprises a processing device Proc_1, 50 miniature holographic projectors evenly distributed over the front and side regions of the clothes, and 100 miniature cameras evenly distributed over the front, side and back regions; the projectors are numbered p_1 to p_50 and the cameras c_1 to c_100, each in order. User B wears a wearable device Dev_2 of the same style, comprising a processing device Proc_2, miniature holographic projectors p_1 to p_50 corresponding to Proc_2, and miniature cameras c_1 to c_100 corresponding to Proc_2; the projectors and cameras are distributed on Dev_2 in the same way as on Dev_1.
Dev_1 has two predetermined projector-camera association modes: in the "front mode", each projector p_x corresponds to camera c_x; in the "back mode", each projector p_x corresponds to camera c_(50+x); where x is an integer in [1, 50].
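The two association modes can be sketched as a small mapping function; the mode names follow the example, while the function name and the error handling are assumptions added for illustration.

```python
def associated_camera(projector: int, mode: str) -> int:
    """Return the camera number paired with projector p_x on Dev_1:
    c_x in "front mode", c_(50 + x) in "back mode", with x in [1, 50]."""
    if not 1 <= projector <= 50:
        raise ValueError("projector number must be in [1, 50]")
    return projector if mode == "front" else projector + 50
```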
When, in step S1, the processing device Proc_1 of Dev_1 obtains from Dev_2 twenty items of scan video information numbered c_1 to c_10 and c_51 to c_60, where the cameras corresponding to c_1 to c_10 are located in user B's front region and those corresponding to c_51 to c_60 in user B's back region, Proc_1 prompts user A: "Display the front or the back?". Based on user A selecting "front", Proc_1 determines that the scanning information to be projected is that of cameras c_1 to c_10.
Then, based on the correspondence in the front mode, the processing device Proc_1 in Dev_1 determines that the projectors numbered p_1 to p_10 in Dev_1 will respectively project the scanning information of c_1 to c_10.
Then, in step S3, the processing device transmits the corresponding part output information to each of the one or more projection ends, so that each projection end projects based on its own part output information.
Preferably, the processing device can determine the projection position information of the one or more projection ends based on preset projection orientation information. The projection orientation information indicates the position, relative to the wearable device, of the projected overall image composed of the one or more pieces of part output information; the projection position information of a projection end indicates the position, relative to the wearable device, of the part output information corresponding to that projection end.
According to a preferred solution of the present invention, the processing device can determine the projection position information corresponding to the one or more projection ends based on the line of sight of the user using them, so that each projection end projects its own part output information according to its own projection position information.
Specifically, the processing device determines the projection orientation information according to the user's line of sight, and from it determines the projection position information of the one or more projection ends, so that the image output by the projection ends is located at a predetermined position facing the user, for example directly ahead along the user's line of sight, or at a forward position at a certain angle to it.
The processing device can determine the user's line-of-sight direction based on the orientation of one or more projection ends at predetermined positions of the wearable device, or based on the user's preset settings. For example, the direction of a front-position projection end, translated vertically by a preset distance, can be taken as the user's line-of-sight direction; or the orientation of a projection end preselected by the user can be taken as that direction.
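The first of these heuristics, keeping a front projection end's facing direction but translating the ray origin vertically toward eye height, might be sketched as follows; all names and the default offset are assumptions, not values from the disclosure.

```python
def gaze_ray(front_end_pos, front_end_dir, lift=0.3):
    """Estimate the user's line of sight from a front-position projection
    end: keep that end's facing direction, but move the ray origin
    vertically by `lift` (toward eye height)."""
    x, y, z = front_end_pos
    return (x, y + lift, z), front_end_dir  # (origin, direction)
```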
The processing device then transmits this projection position information together with the corresponding part output information to each projection end, so that each projection end projects based on its own part output information and projection position information.
Continuing the first example above, based on the projection orientation information "directly ahead" set by user A, the processing device determines, for each of the 50 miniature holographic projectors, the projection angle information corresponding to "directly ahead". It then transmits the corresponding part output information and projection angle information to each of the 50 projectors, which project based on their own part output information and projection angle information, thereby presenting the complete image of picture image_1 directly in front of user A.
According to a preferred embodiment of the present invention, each piece of scanning information comprises scan content information and scan position information, and step S3 comprises step S301 (not shown).
In step S301, the processing device transmits the corresponding part output information to each of the one or more projection ends, so that each projection end projects based on the scan content information and scan position information in its own part output information.
Preferably, the processing device can determine the projection position information of the one or more projection ends based on the scan position information, and then transmit this projection position information together with the corresponding part output information to each projection end, so that each projection end projects based on both.
For example, when the scanning information comprises scan video information and corresponding three-dimensional coordinate information, the processing device can determine, from the three-dimensional coordinates corresponding to each piece of scan video information, the projection position information of each projection end that will project that piece, and then transmit the corresponding scan video information and projection position information to each such projection end. Each projection end determines, from the three-dimensional coordinates of its scan video information, the depth at which that video is to be projected, and projects based on the scan video information, its projection position information and the depth information, thereby presenting at the corresponding projection position a stereoscopic image corresponding to each piece of scan video information and its three-dimensional coordinates.
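One plausible reading of how a projection end derives depth from the three-dimensional coordinates is the distance between the scanned point and the projector itself; the patent does not state a formula, so this sketch is an assumption.

```python
import math

def projection_depth(scan_xyz, projector_xyz):
    """One way a projection end could derive the depth at which to render
    a piece of scan video: the Euclidean distance between the scanned
    point's three-dimensional coordinates and the projector itself."""
    return math.dist(scan_xyz, projector_xyz)
```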
Preferably, the method according to the present invention also comprises step S4 (not shown) and step S5 (not shown).
In step S4, the processing device receives the scanning information of one or more of the multiple scanning ends corresponding to itself. In step S5, the processing device sends the received scanning information to another processing device.
Preferably, the processing device repeats steps S4 and S5 at a predetermined time interval, so that the other processing device can continuously project based on the scanning information it receives. For example, a first processing device receives the scanning information of its corresponding scanning ends once every 0.1 seconds and sends it to a second processing device, so that the projection devices of the wearable device containing the second processing device can continuously project the image of the user wearing the wearable device that contains the first processing device.
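Steps S4 and S5 repeated every 0.1 seconds can be sketched as a fixed-interval loop; the function shape and the `rounds` cutoff (used here only so the sketch terminates) are assumptions.

```python
import time

def stream_scans(scan_ends, send, interval=0.1, rounds=3):
    """Steps S4-S5 on a fixed interval: collect one reading from every
    scanning end, forward the batch to the peer processing device, then
    sleep out the remainder of the interval."""
    for _ in range(rounds):
        start = time.monotonic()
        batch = [read() for read in scan_ends]  # S4: receive scanning info
        send(batch)                             # S5: forward to the peer
        time.sleep(max(0.0, interval - (time.monotonic() - start)))
```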
According to the method of the present invention, the to-be-output information for projection can be processed so that the multiple projection ends contained in a wearable device each project a part of it, together outputting a partial or complete projection of the to-be-output information and giving the user a better visual experience. Moreover, according to the solution of the present invention, scanning information can be transmitted between wearable devices, so that a user can project a virtual image of another user based on the scanning information received from that user; two users can thus interact with each other's virtual images, improving the user experience.
Fig. 2 is a structural diagram of a processing device for assisting multiple projection ends in projection according to the present invention. The processing device according to the present invention comprises an acquisition device 1, a determining device 2 and a transmitting device 3.
Referring to Fig. 2, the acquisition device 1 obtains to-be-output information.
The to-be-output information includes, but is not limited to, either of the following:
1) view-related information: information from another device that can be used for projection, for example an image file, a video file or an interface screenshot from a smartphone.
2) scanning information: scan content information from one or more scanning ends corresponding to the processing device, for example the scan image information or scan video information that a scanning end obtains for itself. Preferably, the scanning information can also include scan position information corresponding to the scan content information, such as three-dimensional or two-dimensional coordinate information.
Then, the determining device 2 determines, according to the to-be-output information, the part output information in the to-be-output information that corresponds to each of one or more of the multiple projection ends.
The way in which the determining device 2 does so includes, but is not limited to, either of the following:
1) When the to-be-output information comprises view-related information, the determining device 2 further comprises a sub-determining device (not shown) and a dividing device (not shown).
The sub-determining device first determines the one or more projection ends, among the multiple projection ends, that will project this to-be-output information; the dividing device then divides the to-be-output information into part output information corresponding to each of the one or more projection ends, based on a predetermined division rule.
Preferably, the multiple projection ends correspond respectively to multiple regions of the wearable device.
Specifically, the sub-determining device can determine, according to the view-related information, the number of projection ends that will project this to-be-output information and the region in which each of them is located. Then, based on a predetermined division rule, the dividing device divides the view-related information into multiple pieces of part output information and assigns each piece to one of the determined projection ends.
Preferably, the sub-determining device can also determine the number of projection ends required based on attributes of the to-be-output information itself. For example, when the to-be-output information is image information, it can be classified by pixel height into "high-definition images" and "standard images", with a larger number of projection ends used to project a "high-definition image" and a smaller number used to project a "standard image".
According to the first example of the invention, the wearable device is a smart vest comprising a processing device and 50 miniature holographic projectors evenly distributed over the front and sides of the vest, and the vest is connected over a network to the smartphone of user A. When the acquisition device 1 in the vest obtains a picture image_1 from the smartphone, the sub-determining device judges from the pixel information of image_1 that image_1 is a high-definition picture, and further determines that all 50 miniature holographic projectors of the vest will be used to project image_1. Then, based on a predetermined division rule, the dividing device performs an image segmentation operation to divide image_1 into 50 pieces of part output information by area, and assigns these pieces, in order from left to right and top to bottom, to the 50 miniature holographic projectors of the vest.
2) When the to-be-output information comprises one or more pieces of scanning information, the determining device 2 comprises a first selecting device (not shown) and a second selecting device (not shown).
The first selecting device selects, from the one or more pieces of scanning information, at least one piece of scanning information for projection. Specifically, the first selecting device can make this selection based on a user's choice, or based on a preset content-selection mechanism.
Preferably, the scan content information contains scanning-end identification information, based on which the processing device can determine the scanning end that the scan content information corresponds to. More preferably, the scanning-end identification information comprises the number of each scanning end corresponding to the processing device.
Preferably, the processing device also comprises a mode determining device (not shown). The mode determining device first determines a scan projection mode; then, based on that mode, the first selecting device selects from the one or more pieces of scanning information at least one piece for projection. The scan projection mode comprises any mode that can be set for projection; for example, when the scanning information corresponds to multiple parts of a user's body, the scan projection modes can include front, back and side modes, so that the projection ends project the scanning information corresponding to the selected body part.
Then, the second selecting device selects, from the multiple projection ends, at least one projection end corresponding to each piece of the selected scanning information, and uses each such piece as the part output information of its corresponding projection end or ends.
According to the second example of the present invention, user A wears a wearable device Dev_1 in the form of clothing. Dev_1 comprises a treating apparatus Proc_1, 50 miniature holographic projectors corresponding to Proc_1 and uniformly distributed over the front and side regions of the clothing, and 100 miniature cameras uniformly distributed over the front, side, and back regions of the clothing; the projectors are numbered in order p_1 to p_50 and the cameras c_1 to c_100. User B wears a wearable device Dev_2 of the same style as Dev_1, which comprises a treating apparatus Proc_2, miniature holographic projectors p_1 to p_50 corresponding to Proc_2, and miniature cameras c_1 to c_100 corresponding to Proc_2. The projectors and cameras of Dev_2 are distributed in the same manner as those of Dev_1.
Dev_1 comprises two predetermined association modes between projectors and cameras: in the "front mode", each projector p_x corresponds to camera c_x; in the "back mode", each projector p_x corresponds to camera c_(50+x); where x is an integer in [1, 50].
When the acquisition device 1 of the treating apparatus Proc_1 of wearable device Dev_1 obtains from wearable device Dev_2 the 20 items of scan video information numbered c_1 to c_10 and c_51 to c_60, where the cameras corresponding to the items numbered c_1 to c_10 are located on the front region of user B and those corresponding to c_51 to c_60 on the back region of user B, the pattern determining device of Proc_1 prompts user A: "Display the front or the back?". Based on the user's selection of "front", the first selecting arrangement determines that the scanning information to be projected comes from the cameras numbered c_1 to c_10.
Then, the second selecting arrangement in wearable device Dev_1 determines, based on the correspondence in the front mode among the predetermined association modes between projectors and cameras, that the scanning information c_1 to c_10 is to be projected respectively by the projectors numbered p_1 to p_10 in Dev_1.
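In this second example, the selection performed by the second selecting arrangement reduces to inverting the projector–camera association in which p_x corresponds to c_x (the mode user A selected by choosing the front). A minimal sketch, with a hypothetical function name and dictionary representation:

```python
def assign_projectors(selected_cameras, mode="front"):
    """Map each selected camera number to the projector that will
    project its scan information, per the two association modes of
    the example (front: p_x <-> c_x; back: p_x <-> c_(50+x))."""
    offset = 0 if mode == "front" else 50
    return {c: c - offset for c in selected_cameras}

# User A chose the front: cameras c_1..c_10 drive projectors p_1..p_10.
routing = assign_projectors(range(1, 11), mode="front")
```

Under the back mode, cameras c_51 to c_60 would instead be routed to projectors p_1 to p_10 via the 50-camera offset.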
Then, the transmitting device 3 transmits the corresponding partial output information to each of the one or more projection ends, so that each projection end projects based on the partial output information corresponding to itself.
Preferably, the treating apparatus can determine the projected position information of the one or more projection ends based on preset projection orientation information. The projection orientation information indicates the position, relative to the wearable device to which the one or more projection ends belong, of the projected overall image composed of the one or more items of partial output information; the projected position information of a projection end indicates the position, relative to that wearable device, of the partial output information corresponding to that projection end.
According to a preferred version of the present invention, the treating apparatus further comprises a sight line determining device (not shown) and a position determining means (not shown).
The sight line determining device determines the direction of the line of sight of the user of the multiple projection ends; the position determining means determines, based on the user's line of sight, the projected position information corresponding to the one or more projection ends, so that each projection end projects the partial output information corresponding to itself at its respective projected position.
Specifically, the position determining means determines the projection orientation information according to the user's line of sight, and from it the projected position information of the one or more projection ends, so that each projection end projects the partial output information corresponding to itself at its respective projected position and the image output by the projection ends is located at a predetermined position facing the user.
For example, such a position may be directly ahead along the user's line of sight, or a forward position at a certain angle to the user's line of sight.
The sight line determining device may determine the user's line of sight based on the orientation of one or more projection ends at predetermined positions in the wearable device, or based on a predetermined user setting. For example, the orientation of the projection end at the front position, translated vertically by a preset distance, may be taken as the user's line of sight; or, as another example, the orientation of a projection end preselected by the user may be taken as the line of sight.
Then, the transmitting device 3 transmits the projected position information together with the corresponding partial output information to each projection end, so that each projection end projects based on the partial output information and projected position information corresponding to itself.
Continuing the first example above, the treating apparatus determines, based on the projection orientation information "directly ahead" set by user A, the projection angle information, corresponding to "directly ahead", for each of the 50 miniature holographic projectors. Then, the transmitting device 3 in the intelligent vest transmits to the 50 miniature holographic projectors the corresponding partial output information and projection angle information, so that the 50 projectors project based on the partial output information and projection angle information corresponding to themselves, thereby presenting the complete image of the picture image_1 directly ahead of user A.
According to a preferred embodiment of the invention, each item of scanning information comprises scan content information and scanning position information, and the transmitting device 3 transmits the corresponding partial output information to each of the one or more projection ends, so that each projection end projects based on the scanning position information and scan content information in the partial output information corresponding to itself.
Preferably, the treating apparatus can determine the projected position information of the one or more projection ends based on the scanning position information. The transmitting device 3 then transmits the projected position information together with the corresponding partial output information to each projection end, so that each projection end projects based on the partial output information and projected position information corresponding to itself.
For example, when the scanning information comprises scan video information and corresponding three-dimensional coordinate information, the treating apparatus can determine, based on the three-dimensional coordinate information corresponding to each item of scan video information, the projected position information of each projection end that will project that item. The transmitting device 3 then transmits to each such projection end the corresponding scan video information and projected position information. Each projection end determines, from the three-dimensional coordinate information of the scan video information corresponding to itself, the depth at which that scan video information is to be projected, and projects based on the scan video information together with its projected position information and depth information, thereby presenting at the corresponding projected position a stereoscopic image corresponding to each item of scan video information and its three-dimensional coordinate information.
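The split between the position transmitted by the treating apparatus and the depth resolved locally at each projection end can be sketched as follows. The coordinate convention (z component as the depth along the viewing axis) and all names below are assumptions for illustration, not part of the patent:

```python
def prepare_stereo_payload(scan_items):
    """scan_items: {camera_no: (video, (x, y, z))}.  For each item,
    record the projected-position info the treating apparatus sends
    along with the video, and the depth the projection end derives
    from the three-dimensional coordinate information."""
    payload = {}
    for cam, (video, (x, y, z)) in scan_items.items():
        payload[cam] = {
            "video": video,
            "projected_position": (x, y),  # transmitted with the video
            "depth": z,                    # resolved at the projection end
        }
    return payload

payload = prepare_stereo_payload({1: ("v1", (0.2, 1.5, 0.8))})
```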
Preferably, the treating apparatus according to the present invention further comprises a receiving trap (not shown) and a dispensing device (not shown).
The receiving trap receives scanning information from one or more of the multiple scanning ends corresponding to the treating apparatus.
The dispensing device sends the received one or more items of scanning information to another treating apparatus.
Preferably, the receiving trap and the dispensing device continuously repeat, at a predetermined time interval, the above steps of receiving scanning information from one or more of the scanning ends corresponding to the treating apparatus and sending the received scanning information to another treating apparatus, so that the other treating apparatus can project continuously based on the scanning information it receives.
For example, the receiving trap of a first treating apparatus receives, every 0.1 seconds, the scanning information of the multiple scanning ends corresponding to it, and the dispensing device sends this scanning information to a second treating apparatus, so that the projection equipment of the wearable device comprising the second treating apparatus can continuously project the image of the user of the wearable device to which the first treating apparatus belongs.
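The periodic relay between the receiving trap and the dispensing device amounts to a simple timed loop. The 0.1 s interval comes from the example above; the callback-based transport, the `cycles` bound, and the function name are purely illustrative assumptions:

```python
import time

def relay_scan_info(receive, send, interval=0.1, cycles=3):
    """Repeatedly receive scanning information from the scanning ends
    and forward it to another treating apparatus, once per interval."""
    forwarded = []
    for _ in range(cycles):
        batch = receive()   # scanning info from the scanning ends
        send(batch)         # hand off to the other treating apparatus
        forwarded.append(batch)
        time.sleep(interval)
    return forwarded

# Simulated transport: three successive frames from camera c_1.
frames = iter([["c_1-frame0"], ["c_1-frame1"], ["c_1-frame2"]])
sent = []
out = relay_scan_info(lambda: next(frames), sent.append, interval=0.0)
```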
According to the solution of the present invention, information to be output for projection can be processed so that the multiple projection ends contained in a wearable device each project a part of that information, together completing the projection of part or all of the information to be output and giving the user a better visual experience. Moreover, scanning information can be transmitted between wearable devices according to the solution of the present invention, so that a user can project, based on the scanning information received from another user, a virtual image corresponding to that other user; two users can thus interact with each other's virtual images, improving the user experience.
The software program of the present invention can be executed by a processor to realize the steps or functions described above. Likewise, the software program of the present invention (including relevant data structures) can be stored in a computer-readable recording medium, for example, RAM memory, a magnetic or optical drive, a floppy disk, or similar devices. In addition, some steps or functions of the present invention may be implemented in hardware, for example, as circuitry that cooperates with a processor to perform each function or step.
In addition, a part of the present invention may be embodied as a computer program product, for example computer program instructions which, when executed by a computer, can invoke or provide the method and/or technical solution according to the present invention through the operation of the computer. The program instructions invoking the method of the present invention may be stored in a fixed or removable recording medium, and/or transmitted through a data stream in broadcast or other signal-bearing media, and/or stored in the working memory of a computer device running according to the program instructions. Here, an embodiment of the present invention comprises a device which comprises a memory for storing computer program instructions and a processor for executing the program instructions, wherein, when executed by the processor, the computer program instructions trigger the device to run the methods and/or technical solutions according to the multiple embodiments of the present invention described above.
To those skilled in the art, it is obvious that the present invention is not limited to the details of the above exemplary embodiments and can be realized in other specific forms without departing from the spirit or essential characteristics of the present invention. The embodiments should therefore be regarded in every respect as exemplary and not restrictive; the scope of the present invention is defined by the appended claims rather than by the above description, and all changes falling within the meaning and range of equivalency of the claims are therefore intended to be embraced in the present invention. No reference numeral in a claim should be construed as limiting the claim concerned. Moreover, the word "comprising" obviously does not exclude other units or steps, and the singular does not exclude the plural. Multiple units or devices stated in a system claim may also be realized by a single unit or device through software or hardware. Words such as "first" and "second" are used to denote names and do not denote any particular order.
Claims (16)
1. A method for assisting multiple projection ends in projecting, wherein the multiple projection ends correspond to a treating apparatus, the method comprising the following steps:
a. obtaining information to be output;
b. determining, according to the information to be output, one or more items of partial output information in the information to be output that respectively correspond to projection ends among the multiple projection ends;
c. transmitting the corresponding partial output information to each of the one or more projection ends, so that each projection end projects based on the partial output information corresponding to itself.
2. The method according to claim 1, wherein, when the information to be output comprises view-related information, step b comprises the following steps:
- determining, among the multiple projection ends, one or more projection ends for projecting the information to be output;
- dividing, based on a predetermined division rule, the information to be output into partial output information respectively corresponding to the one or more projection ends.
3. The method according to claim 1, wherein, when the information to be output comprises one or more items of scanning information, step b comprises the following steps:
b1. selecting, from the one or more items of scanning information, at least one item of scanning information for projection;
b2. selecting, from the multiple projection ends, at least one projection end respectively corresponding to each item of the at least one item of scanning information, and taking each such item of scanning information as the partial output information of the corresponding projection end among the at least one projection end.
4. The method according to claim 3, wherein each item of scanning information comprises scan content information and scanning position information, and wherein step c further comprises the following step:
- transmitting the corresponding partial output information to each of the one or more projection ends, so that each projection end projects based on the scanning position information and scan content information in the partial output information corresponding to itself.
5. The method according to any one of claims 1 to 4, wherein the method further comprises the following steps:
- determining the direction of the line of sight of the user of the multiple projection ends;
- determining, based on the user's line of sight, the projected position information corresponding to the one or more projection ends, so that each projection end projects the partial output information corresponding to itself at its respective projected position.
6. The method according to any one of claims 2 to 5, wherein the method further comprises the following step:
- determining a scanning projection pattern;
wherein step b1 comprises the following step:
- selecting, based on the determined scanning projection pattern, at least one item of scanning information for projection from the one or more items of scanning information.
7. The method according to any one of claims 3 to 6, wherein the treating apparatus also corresponds to one or more scanning ends, and wherein the method further comprises the following steps:
- receiving scanning information from the one or more scanning ends;
- sending the received one or more items of scanning information to another treating apparatus.
8. The method according to any one of claims 1 to 7, wherein the treating apparatus and the multiple projection ends corresponding to it are all contained in a wearable device.
9. A treating apparatus for assisting multiple projection ends in projecting, wherein the multiple projection ends correspond to the treating apparatus, and wherein the treating apparatus comprises:
an acquisition device for obtaining information to be output;
a determining device for determining, according to the information to be output, one or more items of partial output information in the information to be output that respectively correspond to projection ends among the multiple projection ends;
a transmitting device for transmitting the corresponding partial output information to each of the one or more projection ends, so that each projection end projects based on the partial output information corresponding to itself.
10. The treating apparatus according to claim 9, wherein, when the information to be output comprises view-related information, the determining device comprises:
a sub-determining device for determining, among the multiple projection ends, one or more projection ends for projecting the information to be output;
a dividing device for dividing, based on a predetermined division rule, the information to be output into partial output information respectively corresponding to the one or more projection ends.
11. The treating apparatus according to claim 9, wherein, when the information to be output comprises one or more items of scanning information, the determining device comprises:
a first selecting arrangement for selecting, from the one or more items of scanning information, at least one item of scanning information for projection;
a second selecting arrangement for selecting, from the multiple projection ends, at least one projection end respectively corresponding to each item of the at least one item of scanning information, and taking each such item of scanning information as the partial output information of the corresponding projection end among the at least one projection end.
12. The treating apparatus according to claim 11, wherein each item of scanning information comprises scan content information and scanning position information, and wherein the transmitting device is used for:
- transmitting the corresponding partial output information to each of the one or more projection ends, so that each projection end projects based on the scanning position information and scan content information in the partial output information corresponding to itself.
13. The treating apparatus according to any one of claims 9 to 12, wherein the treating apparatus further comprises:
a sight line determining device for determining the direction of the line of sight of the user of the multiple projection ends;
a position determining means for determining, based on the user's line of sight, the projected position information corresponding to the one or more projection ends, so that each projection end projects the partial output information corresponding to itself at its respective projected position.
14. The treating apparatus according to any one of claims 10 to 13, wherein the treating apparatus further comprises:
a pattern determining device for determining a scanning projection pattern;
wherein the first selecting arrangement is used for:
- selecting, based on the determined scanning projection pattern, at least one item of scanning information for projection from the one or more items of scanning information.
15. The treating apparatus according to any one of claims 11 to 14, wherein the treating apparatus also corresponds to one or more scanning ends, and wherein the treating apparatus further comprises:
a receiving trap for receiving scanning information from the one or more scanning ends;
a dispensing device for sending the received one or more items of scanning information to another treating apparatus.
16. The treating apparatus according to any one of claims 9 to 15, wherein the treating apparatus and the multiple projection ends corresponding to it are all contained in a wearable device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410842843.6A CN104571512B (en) | 2014-12-30 | 2014-12-30 | A kind of method and apparatus for assisting multiple projection ends to be projected |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104571512A true CN104571512A (en) | 2015-04-29 |
CN104571512B CN104571512B (en) | 2017-11-24 |
Family
ID=53087790
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410842843.6A Expired - Fee Related CN104571512B (en) | 2014-12-30 | 2014-12-30 | A kind of method and apparatus for assisting multiple projection ends to be projected |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104571512B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5673084A (en) * | 1992-04-17 | 1997-09-30 | Goldstar Co., Ltd. | Movie camera system having view finding and projecting operations and method |
CN1570906A (en) * | 2003-07-14 | 2005-01-26 | 活跃动感科技股份有限公司 | Projection playing system and playing method thereof |
CN103728727A (en) * | 2013-12-19 | 2014-04-16 | 财团法人车辆研究测试中心 | Information display system and display method for automatically adjusting visual range |
CN104166236A (en) * | 2013-05-17 | 2014-11-26 | 许振宇 | Multimedia projection glasses |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019123139A1 (en) * | 2017-12-21 | 2019-06-27 | International Business Machines Corporation | Determine and project holographic object path and object movement with multi-device collaboration |
US10571863B2 (en) | 2017-12-21 | 2020-02-25 | International Business Machines Corporation | Determine and project holographic object path and object movement with multi-device collaboration |
CN111373450A (en) * | 2017-12-21 | 2020-07-03 | 国际商业机器公司 | Determining and projecting holographic object paths and object movements with multi-device collaboration |
US10754297B2 (en) | 2017-12-21 | 2020-08-25 | International Business Machines Corporation | Determine and project holographic object path and object movement with multi-device collaboration |
GB2582725A (en) * | 2017-12-21 | 2020-09-30 | Ibm | Determine and project holographic object path and object movement with multi-device collaboration |
GB2582725B (en) * | 2017-12-21 | 2021-06-23 | Ibm | Determine and project holographic object path and object movement with multi-device collaboration |
CN111373450B (en) * | 2017-12-21 | 2024-03-26 | 国际商业机器公司 | Determining and projecting holographic object paths and object movements using multi-device collaboration |
Also Published As
Publication number | Publication date |
---|---|
CN104571512B (en) | 2017-11-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110288692B (en) | Illumination rendering method and device, storage medium and electronic device | |
US12243170B2 (en) | Live in-camera overlays | |
US8781161B2 (en) | Image processing method and apparatus for generating a 3D model of a target object | |
US8780119B2 (en) | Reconstruction render farm used in motion capture | |
KR101556992B1 (en) | 3d scanning system using facial plastic surgery simulation | |
KR20080069601A (en) | One or more computer-readable media that store information that enables a device to execute a process for gaming stereo video. | |
EP2342900A1 (en) | Generation of occlusion data for image properties | |
CN105611267B (en) | Merging of real world and virtual world images based on depth and chrominance information | |
CN111080776A (en) | Processing method and system for human body action three-dimensional data acquisition and reproduction | |
CN108833877A (en) | Image processing method and device, computer installation and readable storage medium storing program for executing | |
CN109407824B (en) | Method and device for synchronous motion of human body model | |
KR101805636B1 (en) | Automatic extracting system for 3d digital image object based on 2d digital image and extracting method using thereof | |
KR101451236B1 (en) | Method for converting three dimensional image and apparatus thereof | |
CN104571512A (en) | Method and device for assisting multiple projection ends in projection | |
CN109712230A (en) | Threedimensional model compensation process, device, storage medium and processor | |
CN112348965B (en) | Imaging method, imaging device, electronic equipment and readable storage medium | |
JP7479793B2 (en) | Image processing device, system for generating virtual viewpoint video, and method and program for controlling the image processing device | |
JP2022028091A (en) | Image processing device, image processing method, and program | |
CN105631938B (en) | Image processing method and electronic equipment | |
TW201006527A (en) | Measuring object contour method and measuring object contour apparatus | |
CN114445648A (en) | Obstacle recognition method, apparatus and storage medium | |
CN113485547B (en) | An interactive method and device for holographic sandbox | |
CN111369612A (en) | Three-dimensional point cloud image generation method and equipment | |
KR100879802B1 (en) | Method and apparatus for generating 3D image from virtual viewpoint | |
CN111754543B (en) | Image processing method, device and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right | ||
TR01 | Transfer of patent right |
Effective date of registration: 20210108 Address after: 310052 room 508, 5th floor, building 4, No. 699 Wangshang Road, Changhe street, Binjiang District, Hangzhou City, Zhejiang Province Patentee after: Alibaba (China) Co.,Ltd. Address before: 100080 room 701-52, 7th floor, 2 Haidian East 3rd Street, Haidian District, Beijing Patentee before: ZHUOYI CHANGXIANG (BEIJING) TECHNOLOGY Co.,Ltd. |
CF01 | Termination of patent right due to non-payment of annual fee | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20171124 |