
CN103946732B - Video display modification based on sensor input to see-through, near-eye displays - Google Patents


Info

Publication number
CN103946732B
Authority
CN
China
Prior art keywords
light
image
display
film
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201280046955.XA
Other languages
Chinese (zh)
Other versions
CN103946732A (en)
Inventor
J. D. Haddick
R. F. Osterhout
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Technology Licensing LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Technology Licensing LLC filed Critical Microsoft Technology Licensing LLC
Publication of CN103946732A publication Critical patent/CN103946732A/en
Application granted granted Critical
Publication of CN103946732B publication Critical patent/CN103946732B/en


Classifications

    • G02B27/0172 — Head-up displays; head mounted, characterised by optical features
    • G02B27/0093 — Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G02B27/017 — Head-up displays; head mounted
    • G02B27/0176 — Head-up displays; head mounted, characterised by mechanical features
    • G06F1/163 — Wearable computers, e.g. on a belt
    • G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/167 — Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G02B2027/0112 — Head-up displays characterised by optical features, comprising a device for generating a colour display
    • G02B2027/0118 — Head-up displays characterised by optical features, comprising devices for improving the contrast of the display / brightness control visibility

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • General Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a near-field communication (NFC) device comprising a wrist-worn NFC-enabled electronic device, wherein the wrist-worn NFC-enabled electronic device includes a first communication link for communicating with a second NFC-enabled electronic device via an NFC protocol, and a second communication link for communicating with an eyepiece and receiving control commands via a medium-range communication protocol. The wrist-worn NFC-enabled electronic device facilitates data transfer between the eyepiece and the second NFC-enabled electronic device. The eyepiece includes optics enabling a see-through display on which data is displayed.

Description

Video display modification based on sensor input to see-through, near-eye displays
Cross reference to related applications
This application claims priority to the following U.S. provisional patent application, which is incorporated herein by reference in its entirety:
U.S. Provisional Application 61/539,269, filed September 26, 2011.
This application is a continuation-in-part of the following U.S. non-provisional patent applications, each of which is incorporated herein by reference in its entirety:
U.S. Non-provisional Application 13/591,187, filed August 21, 2012, which claims the benefit of the following provisional applications, each of which is incorporated herein by reference in its entirety: U.S. Provisional Patent Applications 61/679,522; 61/679,558; 61/679,542; 61/679,578; 61/679,601; 61/679,541; 61/679,548; 61/679,550; 61/679,557; and 61/679,566, all filed August 3, 2012; U.S. Provisional Patent Application 61/644,078, filed May 8, 2012; U.S. Provisional Patent Application 61/670,457, filed July 11, 2012; and U.S. Provisional Patent Application 61/674,689, filed July 23, 2012.
U.S. Non-provisional Application 13/441,145, filed April 6, 2012, which claims the benefit of the following provisional applications, each of which is incorporated herein by reference in its entirety: U.S. Provisional Patent Application 61/598,885, filed February 14, 2012; U.S. Provisional Patent Application 61/598,889, filed February 14, 2012; U.S. Provisional Patent Application 61/598,896, filed February 12, 2012; and U.S. Provisional Patent Application 61/604,917, filed February 29, 2012.
U.S. Non-provisional Application 13/429,413, filed March 25, 2012, which claims the benefit of the following provisional application, incorporated herein by reference in its entirety: U.S. Provisional Patent Application 61/584,029, filed January 6, 2012.
U.S. Non-provisional Application 13/341,758, filed December 30, 2011, which claims the benefit of the following provisional application, incorporated herein by reference in its entirety: U.S. Provisional Patent Application 61/557,289, filed November 8, 2011.
U.S. Non-provisional Application 13/232,930, filed September 14, 2011, which claims the benefit of the following provisional applications, each of which is incorporated herein by reference in its entirety: U.S. Provisional Application 61/382,578, filed September 14, 2010; U.S. Provisional Application 61/472,491, filed April 6, 2011; U.S. Provisional Application 61/483,400, filed May 6, 2011; U.S. Provisional Application 61/487,371, filed May 18, 2011; and U.S. Provisional Application 61/504,513, filed July 5, 2011.
U.S. Non-provisional Patent Applications 13/037,324 and 13/037,335, both filed February 28, 2011, each of which claims the benefit of the following provisional applications, each of which is incorporated herein by reference in its entirety: U.S. Provisional Patent Application 61/308,973, filed February 28, 2010; U.S. Provisional Patent Application 61/373,791, filed August 13, 2010; U.S. Provisional Patent Application 61/382,578, filed September 14, 2010; U.S. Provisional Patent Application 61/410,983, filed November 8, 2010; U.S. Provisional Patent Application 61/429,445, filed January 3, 2011; and U.S. Provisional Patent Application 61/429,447, filed January 3, 2011.
Background
Field:
The present invention relates to an augmented reality eyepiece, associated control technologies, and applications, and more particularly to software applications operating on the eyepiece.
The invention further relates to thin display technologies that use switchable mirrors in a serialized fashion to provide an image from a waveguide.
Head-mounted displays with reflective surfaces are well known in the industry. United States Patent 4,969,714 describes a head-mounted display with a single obliquely angled, partially reflective beam splitter plate. Although this approach provides excellent brightness and color uniformity over the displayed field of view, the optical system is relatively thick because of the angled beam splitter plate.
United States Patents 6,829,095 and 7,724,441 describe head-mounted displays that use arrays of partially reflecting surfaces to provide a thinner optical system; such a display is shown in Figure 124, in which an array of partially reflecting surfaces 12408 provides image light 12404 over the display field of view, allowing the user to view the displayed image together with a view of the environment in front of the user. The image light 12404 seen by the user is composed of the combined reflected light from each of the multiple partially reflecting surfaces 12408. Light from the image source 12402 must pass through the multiple partially reflecting surfaces 12408, with a portion of the light reflected toward the user's eye at each surface to provide image light 12404. To provide a uniform image over the display field of view, the reflection characteristics of the partially reflecting surfaces 12408 must be precisely controlled: the surface nearest the image source must have the lowest reflectivity and the surface farthest from the image source the highest, with the reflectivity of the partially reflecting surfaces 12408 generally increasing roughly linearly with distance from the image source. This presents manufacturing and cost problems, because the reflectivity of each partially reflecting surface 12408 differs from that of its neighbors, and the reflectivity of each surface must be tightly controlled. It is therefore difficult, using an array of partially reflecting surfaces, to provide an image with uniform brightness and color over the entire display field of view.
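The uniformity constraint can be made concrete with a small idealized (lossless) model: if each of N surfaces is to contribute an equal share of the source light toward the eye, the k-th surface (counting from the image source, starting at k = 0) must reflect 1/(N − k) of the light that reaches it, so the required reflectivity rises monotonically toward the far end of the array. A minimal sketch, with the surface count chosen arbitrarily for illustration:

```python
def uniform_reflectivities(n_surfaces):
    """Reflectivity each of n_surfaces needs so that every surface redirects
    the same fraction of the original image light toward the eye
    (idealized: no absorption or other losses)."""
    return [1.0 / (n_surfaces - k) for k in range(n_surfaces)]

def contributions(reflectivities):
    """Fraction of the original light each surface actually delivers."""
    out, remaining = [], 1.0
    for r in reflectivities:
        out.append(remaining * r)   # reflected share of what reaches this surface
        remaining *= 1.0 - r        # the rest is transmitted onward
    return out

rs = uniform_reflectivities(5)
print([round(r, 3) for r in rs])                 # [0.2, 0.25, 0.333, 0.5, 1.0]
print([round(c, 3) for c in contributions(rs)])  # [0.2, 0.2, 0.2, 0.2, 0.2]
```

For five surfaces the required reflectivities run from 20% up to 100%, each surface then delivering the same 20% of the original light — which illustrates why every surface in such an array needs a different, tightly controlled coating.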
Alternatively, as described in United States Patent 4,711,512, diffraction gratings can be used to redirect the image light entering and leaving the waveguide toward the display field of view. However, diffraction gratings are costly and introduce color aberrations.
Consequently, there is a continuing need for a thinner optical system for head-mounted displays that also provides an image with good brightness and color uniformity over the display field of view.
The invention further relates to a compact, lightweight frontlight that includes a wire-grid polarizer film as a partially reflecting surface to deflect illumination light downward onto a reflective image source.
In a display with a reflective image source and a frontlight, as shown in Figure 133, illumination light 13308 passes from an edge light source 13300 and is deflected by the frontlight 13304 to illuminate the reflective image source 13302. The illumination light 13308 then reflects from the reflective image source 13302, becoming image light 13310, which passes back through the frontlight 13304 and into the display optics. The frontlight 13304 thus deflects the illumination light 13308 arriving from the edge light source 13300 while allowing the reflected image light 13310 to pass through undeflected, so that the image light 13310 can pass into the display optics, which may be dispersive when the display is a flat-screen display, or refractive or diffractive when the display is a near-eye display. In this embodiment, the display optics may include a diffuser.
For a reflective image source such as a liquid crystal on silicon (LCOS) image source, the illumination light is polarized, and the reflective image source includes a quarter-wave retardation film that changes the polarization state of the light as it reflects from the image source. A polarizer is then included in the display optics so that the polarization effects imparted by the liquid crystal form an image as the image light passes through the display optics.
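The role of that polarizer can be illustrated with Malus's law: if a pixel's liquid crystal rotates the polarization by an angle set by its drive level, a crossed analyzer passes sin² of that angle, turning polarization modulation into visible gray levels. A toy sketch — the crossed-analyzer arrangement and the example rotation angles are assumptions for illustration, not taken from the cited displays:

```python
import math

def pixel_intensity(rotation_deg):
    """Fraction of light a crossed analyzer passes after a pixel's liquid
    crystal rotates the polarization by rotation_deg (Malus's law)."""
    return math.sin(math.radians(rotation_deg)) ** 2

# Hypothetical drive levels mapped to polarization rotations:
for level, rotation in [("black", 0), ("mid-gray", 45), ("white", 90)]:
    print(level, round(pixel_intensity(rotation), 3))  # 0.0, 0.5, 1.0
```

No rotation leaves the light blocked by the analyzer (black), while a 90° rotation passes it fully (white), so the per-pixel polarization change is what forms the image.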
United States Patent 7,163,330 describes a series of frontlights with grooves in the upper surface of the frontlight, so that light from the edge light source is deflected downward to the reflective image source and the reflected image light is allowed to pass through flat regions between the grooves into the display optics. Figure 134 shows an illustration of a frontlight 13400 with grooves 13410 and flat regions 13408. Illumination light 13402 from the edge light source 13300 reflects from the grooves 13410 and is deflected downward to illuminate the reflective image source 13302. Image light 13404 reflects from the reflective image source 13302 and passes through the flat regions 13408 of the frontlight 13400. Both linear and curved grooves 13410 are described. However, for the grooves 13410 to deflect the illumination light 13402 effectively, they must occupy a large area of the frontlight, which limits the area of the flat regions 13408; and because light scatters from the grooves as it passes back through the frontlight, the quality of the image delivered to the display optics is degraded. The frontlight 13400 is also typically formed from a solid piece of material and can therefore be relatively heavy.
United States Patent 7,545,571 provides a wearable display system that includes a reflective image source 13502 with a polarizing beam splitter 13512 serving as a frontlight, so that the illumination light 13504 provided by the edge light source 13500 is deflected and polarized onto the reflective image source 13502, as shown in Figure 135. The polarizing beam splitter 13512 is an obliquely angled plane within a solid block, with a separate curved reflector 13514 associated with the edge light source 13500. The curved reflector 13514 can be a total internal reflection block 13510 connected to the polarizing beam splitter 13512. As a result, the frontlight disclosed in this patent, with its polarizing beam splitter solid block and total internal reflection block, is large and relatively heavy. Figure 135 also shows the image light 13508.
There remains a need for a frontlight for displays with reflective image sources that provides good image quality with little scattered light while still being compact and lightweight.
The invention further relates to optically flat surfaces made with optical films. More specifically, the present invention provides methods for manufacturing optically flat beam splitters using optical films.
Optical films are available for a variety of purposes, including beam splitters, polarizing beam splitters, holographic reflectors, and mirrors. In imaging applications, and particularly in reflective imaging applications, it is important that the optical film be very flat so as to preserve the image wavefront. Some optical films have a pressure-sensitive adhesive on one side, allowing the optical film to be attached to a substrate for structural support and to help hold the film flat. However, optical films attached to substrates in this way often have surfaces with small-scale undulations and bumps known as orange peel, which prevent the surface from being optically flat and therefore degrade the reflected image.
United States Patent Application 2009/0052030 provides a method for manufacturing an optical film in which the optical film is a wire-grid polarizer. However, no technique is provided for producing a film with optically flat properties.
United States Patents 4,537,739 and 4,643,789 provide methods for conveying original images into a mold using a tape and attaching the images to the molded structure. However, these methods do not anticipate the special requirements of optical films.
United States Patent Application 2009/0261490 provides methods and molds for manufacturing simple optical articles that include optical films. The method is directed at producing curved surfaces, since it includes a limit on the ratio of radius of curvature to diameter to avoid wrinkles in the film caused by deformation of the film during molding. The special requirements of manufacturing an optically flat surface that includes an optical film are not addressed.
United States Patent 7,820,081 provides a method for laminating functional film layers to lenses, using a heat-cured adhesive to bond the functional film to the lens. However, this process includes thermoforming the optical film while the lens is hot, so that the optical film, the adhesive, and the lens deform together during bonding. This method is therefore unsuitable for manufacturing optically flat surfaces.
Consequently, there remains a need for methods of using optical films such that surfaces including the optical films can be provided with optically flat properties.
Summary of the invention
In embodiments, the eyepiece may include internal software applications running on an integrated multimedia computing facility, adapted for displaying 3D augmented reality (AR) content and for interacting with the eyepiece. 3D AR software applications may be developed in conjunction with mobile applications and offered through an application store, or developed as standalone applications specifically for the eyepiece as the end-use platform and offered through a dedicated 3D AR eyepiece store. Internal software applications may interface with the input and output facilities provided by the eyepiece's internal and external mechanisms, such as capture devices fed from the surrounding environment, sensor devices, user action capture devices, internal processing facilities, internal multimedia processing facilities, other internal applications, cameras, sensors, microphones, transceivers, tactile interfaces, external computing facilities, external applications, event and/or data feeds, external devices, third parties, and the like. Command and control modes operated in conjunction with the eyepiece may be initiated by sensed inputs, inputs from input devices, user actions, external device interactions, event and/or data feeds, internal application execution, external application execution, and the like. In embodiments, there may be a series of steps performed in exercising control, as provided through the internal software application, including a combination of at least two of the following: events and/or data feeds, sensed inputs and/or sensing devices, user action capture inputs and/or outputs, user movements and/or actions for control and/or initiation of commands, command and/or control modes and interfaces in which the inputs may be reflected, applications on the platform that may use the inputs to respond, communication and/or connection from the platform to external systems and/or devices, external devices, external applications, feedback to the user (such as concerning external devices or external applications), and the like.
The present invention also provides methods for providing a thinner optical system that presents an image with improved brightness and color uniformity over the display field of view. The invention includes an array of narrow switchable mirrors extending over the display area to provide the display field of view, wherein the switchable mirrors are used sequentially to reflect portions of the light from the image source, thereby presenting sequential portions of the image to the user. By rapidly switching the narrow switchable mirrors from transparent to reflective in a repeating sequence, the user perceives the portions of the image combined into the whole image as provided by the image source. Provided that each narrow switchable mirror is switched at 60 Hz or higher, the user will not perceive flicker in the portions of the image.
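The time-sequential scheme can be sketched with a toy schedule: the display is tiled into N narrow mirror strips, one strip is reflective per sub-frame slot, and over a full frame every strip is presented exactly once. The mirror count here is a hypothetical choice for illustration; only the 60 Hz refresh figure comes from the text:

```python
FRAME_RATE_HZ = 60   # each mirror completes its transparent/reflective cycle at 60 Hz
N_MIRRORS = 8        # hypothetical number of narrow mirror strips

def strips_shown_per_frame(n_mirrors):
    """One mirror is reflective per sub-frame slot, in order; over a full
    frame every image strip is presented exactly once."""
    return [slot for slot in range(n_mirrors)]  # mirror `slot` reflects in slot `slot`

slot_ms = 1000.0 / (FRAME_RATE_HZ * N_MIRRORS)
print(strips_shown_per_frame(N_MIRRORS))                 # [0, 1, 2, 3, 4, 5, 6, 7]
print(round(slot_ms, 3), "ms reflective time per mirror per frame")  # 2.083 ms
```

With eight strips at a 60 Hz frame rate, each mirror is reflective for only about 2 ms per frame, which is why the individual mirrors must switch between transparent and reflective very rapidly.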
Several embodiments of narrow switchable mirror arrays are provided. In one embodiment, the switchable mirrors are liquid crystal switchable mirrors. In another embodiment, the switchable mirrors are moving prism elements that use an air gap to provide switchable total-internal-reflection mirrors.
In an alternative embodiment, not all of the switchable mirrors are used sequentially; instead, selected groups of switchable mirrors are used, chosen based on the spacing of the user's eyes.
The present invention also provides a compact, lightweight frontlight that includes a wire-grid polarizer film as a partially reflecting surface to deflect illumination light downward onto a reflective image source. The edge light source is polarized, and the wire-grid polarizer is oriented so that the illumination light is reflected while the image light is transmitted through to the display optics. By using a flexible wire-grid polarizer film, the invention provides a partially reflecting surface that can be curved to focus the illumination light onto the reflective image source, thereby improving efficiency and improving the uniformity of image brightness. The wire-grid polarizer also has very low light scatter, so image quality is preserved as the image light passes through the frontlight on its way to the display optics. Further, because the partially reflecting surface is a wire-grid polarizer film, the majority of the frontlight consists of air, so the frontlight is light in weight.
The present invention also provides methods for manufacturing surfaces with optically flat properties using optical films. In various embodiments of the invention, the optical film may comprise a beam splitter, a polarizing beam splitter, a wire-grid polarizer, a mirror, a partial mirror, or a holographic film. An advantage provided by the invention is that the surface of the optical film is optically flat, so that the wavefront of the light is preserved to provide an improved image.
In certain embodiments, the present invention provides an image display system that includes an optically flat optical film, with a substrate that holds the optically flat optical film between an image source and a viewing position within a display module housing, wherein an image provided by the image source is reflected by the optical film to the viewing position, and the substrate with the optical film is replaceable within the display module housing.
In other embodiments of the invention, the optical film is attached to a molded structure, so that the optical film is part of the display module housing.
In the prior art display 18700 shown in Figure 187, with a reflective image source 18720 and a solid beam-splitter cube frontlight 18718, light 18712 passes from a light source 18702 to a diffuser 18704, where it is made more uniform to provide illumination light 18714. The illumination light 18714 is redirected by a partially reflecting layer 18708 to illuminate the reflective image source 18720. The illumination light 18714 then reflects from the reflective image source 18720, becoming image light 18710, which passes back through the partially reflecting layer 18708 and into associated imaging optics (not shown) that present the image to the viewer. The solid beam-splitter cube 18718 thus redirects the illumination light 18714 while allowing the reflected image light 18710 to pass through without being redirected, so that the image light can pass to the imaging optics, which may be dispersive when the display is a flat-screen display, or refractive or diffractive when the display is a projector or near-eye display.
For a reflective image source such as a liquid crystal on silicon (LCOS) image source, the illumination light is polarized, and as the illumination light reflects from the image source, the reflective image source changes its polarization state according to the image content being presented, thereby forming image light. An analyzer polarizer is then included so that the polarization effects imparted by the LCOS form an image as the image light passes through the imaging optics, and the image is presented to the viewer.
United States Patent 7,545,571 provides a wearable display system that includes a reflective image source with a polarizing beam splitter serving as a frontlight, so that the illumination light provided by the edge light source is deflected and polarized onto the reflective image source. The polarizing beam splitter is an obliquely angled plane within a solid block, with a separate curved reflector associated with the edge light source. The curved reflector can be a total internal reflection block connected to the polarizing beam splitter. As a result, the frontlight disclosed in this patent, with its polarizing beam splitter solid block and total internal reflection block, is large and relatively heavy.
United States Patent 6,195,136 discloses a series of frontlight illumination methods for reflective image sources, including a method of making the frontlight more compact by using a curved beam splitter. However, the curved beam splitter is positioned rather far from the image source to reduce the angle of the light that travels from the light source and is then reflected by the beam splitter onto the image source. Moreover, light is provided at only one side of the frontlight, so the beam splitter must be at least as large as the image source. As a result, when measured along the optical axis, the overall size of the frontlight remains relatively large compared with the illuminated area of the image source.
There remains a need for a frontlight for displays with reflective image sources that provides good image quality with little scattered light while still being compact, efficient, and lightweight.
The present invention provides a compact, efficient, and lightweight frontlight in a display assembly that includes a partially reflecting surface to redirect illumination light from side light sources onto a reflective image source, wherein the size of the display assembly, as measured by the height of the diffuser area, is much smaller than the width of the illuminated reflective image source. In certain embodiments, the partially reflecting surface can be curved to focus the light from the light source onto the reflective image source. The light source can be polarized, and a polarizing beam splitter film can be used as the curved partially reflecting surface, so that the illumination light is redirected while the reflected image light is transmitted through to the imaging optics. The polarizing beam splitter film is lightweight and has very low light scatter, so image quality is preserved as the image light passes through the frontlight on its way to the display optics.
In other embodiments of the invention, light sources are arranged on opposite sides of the frontlight to provide light to two opposite edges of the reflective image source. In this case, the partially reflecting surface is composed of two surfaces, one deflecting the illumination light from one light source onto one half of the image source and the other deflecting light onto the other half. In this embodiment, the partially reflecting surfaces can be curved or flat.
In another embodiment of the present invention, the partially reflecting surface is a polarizing beam splitter and the light source is polarized, so that light from the light source is first redirected by the polarizing beam splitter and is then transmitted after being reflected by the reflective image source with a changed polarization.
In yet another embodiment, the light from the light source is unpolarized, so the polarizing beam splitter reflects one polarization state of the light to illuminate half of the reflective image source and transmits the other polarization state. The transmitted polarization state of the light passes to the opposite side of the frontlight, where it is recycled. Recycling of the transmitted polarization state can be accomplished by passing the light through a quarter-wave film and reflecting it with a mirror, so that it passes back through the quarter-wave film and thereby changes polarization state. After the polarization state of the transmitted and reflected light has been changed, the light is redirected by the polarizing beam splitter to illuminate the other half of the reflective image source. In an alternative embodiment, the lights on the two sides of the frontlight work in a complementary fashion, wherein the transmitted light arriving from the opposite side becomes unpolarized when it interacts with the diffuser on that side and is thereby recycled.
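The quarter-wave recycling step can be checked with Jones calculus: a double pass through a quarter-wave film with its fast axis at 45° acts as a half-wave plate, converting the transmitted linear polarization into the orthogonal state, which the beam splitter then redirects. A minimal sketch (an illustrative model, not from the patent text), in a convention where the ideal mirror contributes only a global phase:

```python
def apply(m, v):
    """Apply a 2x2 Jones matrix to a 2-component Jones vector."""
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

# Jones matrix of a quarter-wave film with its fast axis at 45 degrees
# (up to a global phase).
qwp45 = [[(1 + 1j) / 2, (1 - 1j) / 2],
         [(1 - 1j) / 2, (1 + 1j) / 2]]

horizontal = [1, 0]  # the polarization state the beam splitter transmits

# First pass: linear -> circular.
circular = apply(qwp45, horizontal)
# Mirror reflection modeled as a global phase only, then the return pass:
# circular -> linear, now orthogonal to the original state.
returned = apply(qwp45, circular)

print([round(abs(c), 6) for c in circular])  # [0.707107, 0.707107] (circular)
print([round(abs(c), 6) for c in returned])  # [0.0, 1.0] -> vertical
```

The returned light is vertically polarized, so the polarizing beam splitter that originally transmitted it now reflects it toward the other half of the image source, which is the recycling behavior described above.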
In one more embodiment of the present invention, the method for manufacturing the headlight with flexible portion reflectance coating is provided. Flexible membrane can be supported at edge and independently can be sandwiched in transparent two without support or flexible membrane on reflectogram image source Or between multiple solid members.Solid member can be shaped being placed in before flexible membrane contacts.Solid member can be according to flat geometric form Shape or curved geometric keep flexible membrane.In another embodiment, flexible membrane can be supported at edge, then solid member Can be cast on the spot, so that flexible membrane is embedded in transparent solid material.
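As a rough sanity check on the value of the polarization recycling described above, the fraction of an unpolarized source that ends up illuminating the image source can be modeled in a few lines. This is a sketch under assumed loss figures; the patent specifies no numbers, and the function name and the 5% default per-pass loss are illustrative only.

```python
def frontlight_efficiency(recycle: bool, pass_loss: float = 0.05) -> float:
    """Fraction of unpolarized source light usable to illuminate the image source.

    A polarizing beam splitter reflects one linear polarization (50% of
    unpolarized input) toward the image source.  Without recycling, the
    transmitted 50% is simply lost.  With a quarter-wave film and mirror,
    the transmitted half makes a round trip, returns with rotated
    polarization, and is redirected to the other half of the panel.
    `pass_loss` is an assumed loss per film/mirror pass (not from the patent).
    """
    direct = 0.5  # polarization state reflected on the first encounter
    if not recycle:
        return direct
    # transmitted half: PBS pass -> QW film -> mirror -> QW film -> PBS reflect
    recycled = 0.5 * (1.0 - pass_loss) ** 4
    return direct + recycled

print(f"without recycling: {frontlight_efficiency(False):.2f}")  # 0.50
print(f"with recycling:    {frontlight_efficiency(True):.2f}")   # 0.91
```

Even with several lossy passes assumed, recycling nearly doubles the usable illumination, which is the motivation for the quarter-wave arrangement.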
In one embodiment, a system may comprise an interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly; the processor is adapted to modify the content, wherein the modification is made in response to a sensor input. The content may be a video image. The modification may be at least one of: adjusting brightness, adjusting color saturation, adjusting color balance, adjusting hue, adjusting video resolution, adjusting transparency, adjusting compression ratio, adjusting frames per second, isolating a portion of the video, stopping playback of the video, pausing the video, or restarting the video. The sensor input may be obtained from at least one of: a charge-coupled device, a black silicon sensor, an IR sensor, an acoustic sensor, an induction sensor, a motion sensor, an optical sensor, an opacity sensor, a proximity sensor, an inductive sensor, an eddy-current sensor, a passive infrared proximity sensor, radar, a capacitive sensor, a capacitive displacement sensor, a Hall effect sensor, a magnetic sensor, a GPS sensor, a thermal imaging sensor, a thermocouple, a thermistor, a photoelectric sensor, an ultrasonic sensor, an infrared laser sensor, an inertial motion sensor, a MEMS internal motion sensor, an ultrasonic 3D motion sensor, an accelerometer, an inclinometer, a force sensor, a piezoelectric sensor, a rotary encoder, a linear encoder, a chemical sensor, an ozone sensor, a smoke sensor, a heat sensor, a magnetometer, a carbon dioxide detector, a carbon monoxide detector, an oxygen sensor, a glucose sensor, a smoke detector, a metal detector, a rain sensor, an altimeter, GPS, detection of being outdoors, detection of context, detection of activity, an object detector (e.g., a billboard), a marker detector (e.g., for geo-locating an advertisement), a laser rangefinder, sonar, capacitance, optical response, a heart rate sensor, or an RF/micropower impulse radio (MIR) sensor. Playback of the content may be stopped in response to an input from an accelerometer indicating that the user's head is moving. An audio sensor input may be generated by the speech of at least one participant in a video conference. A visual sensor input may be a video image of at least one participant of a video conference or a video image of a visual presentation. The modification may be making the video image at least one of more or less transparent in response to an indication from a sensor that the user is moving.
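The sensor-driven content modifications enumerated above (pausing playback on head motion, changing transparency, adjusting brightness) can be sketched as a simple dispatch from sensor readings to display state. Everything here — the `VideoState` fields, the sensor names, and the thresholds — is a hypothetical illustration of the behavior described, not an API from the patent.

```python
from dataclasses import dataclass

@dataclass
class VideoState:
    brightness: float = 1.0    # relative display brightness
    transparency: float = 0.0  # 0 = opaque overlay, 1 = fully transparent
    playing: bool = True

def apply_sensor_input(state: VideoState, sensor: str, value: float) -> VideoState:
    """Modify displayed video content in response to a sensor input.

    Illustrative policy: an accelerometer reading above a threshold
    (head in motion) pauses playback and makes the video more transparent
    so the wearer can see the surroundings; an ambient light reading
    adjusts display brightness.  Thresholds are assumed values.
    """
    if sensor == "accelerometer" and value > 1.5:  # assumed motion threshold
        state.playing = False
        state.transparency = min(1.0, state.transparency + 0.5)
    elif sensor == "light":
        # brighten the display in bright surroundings, dim it in the dark
        state.brightness = max(0.1, min(1.0, value))
    return state

state = VideoState()
state = apply_sensor_input(state, "accelerometer", 2.0)
print(state.playing, state.transparency)  # False 0.5
```

A real implementation would run such a policy in the integrated processor's event loop, fed by whichever of the listed sensors the eyepiece carries.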
In one embodiment, a system may comprise an interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly, the processor being adapted to modify the content, wherein the modification is made in response to a sensor input; and the system further includes an integrated video image capture facility that records an aspect of the surrounding environment and provides the content for display.
These and other systems, methods, objects, features, and advantages of the present invention will be apparent to those skilled in the art from the following detailed description of the embodiments and the accompanying drawings.
All documents mentioned herein are hereby incorporated by reference in their entirety. References to items in the singular should be understood to include items in the plural, and vice versa, unless explicitly stated otherwise or clear from the text. Grammatical conjunctions are intended to express any and all disjunctive and conjunctive combinations of conjoined clauses, sentences, words, and the like, unless otherwise stated or clear from the context.
Brief Description of the Drawings
The invention and the following detailed description of certain embodiments thereof may be understood by reference to the following figures:
Figure 1 depicts an illustrative embodiment of the optical arrangement.
Figure 2 depicts an RGB LED projector.
Figure 3 depicts the projector in use.
Figure 4 depicts an embodiment of the waveguide and correction lens disposed in a frame.
Figure 5 depicts a design for a waveguide eyepiece.
Figure 6 depicts an embodiment of the eyepiece with a see-through lens.
Figure 7 depicts an embodiment of the eyepiece with a see-through lens.
Figures 8A-C depict an embodiment of the eyepiece arranged in a flip-up/flip-down configuration.
Figures 8D-E depict embodiments of snap-fit elements of a secondary optic.
Figure 8F depicts an embodiment of a flip-up/flip-down electro-optics module.
Figure 9 depicts an electrochromic layer of the eyepiece.
Figure 10 depicts the advantages of the eyepiece in real-time image enhancement, keystone correction, and virtual perspective correction.
Figure 11 depicts a plot of responsivity versus wavelength for three substrates.
Figure 12 illustrates the performance of the black silicon sensor.
Figure 13A depicts an existing night vision system, Figure 13B depicts the night vision system of the present invention, and Figure 13C illustrates the difference in responsivity between the two.
Figure 14 depicts a tactile interface of the eyepiece.
Figure 14A depicts motions in an embodiment of the eyepiece featuring nod control.
Figure 15 depicts a ring that controls the eyepiece.
Figure 15AA depicts a ring that controls an eyepiece with an integrated camera, which in one embodiment allows the user to provide a video image of themselves as part of a video conference.
Figure 15A depicts a hand-mounted sensor in an embodiment of a virtual mouse.
Figure 15B depicts facial actuation sensors mounted on the eyepiece.
Figure 15C depicts finger-pointing control of the eyepiece.
Figure 15D depicts finger-pointing control of the eyepiece.
Figure 15E depicts an example of eye tracking control.
Figure 15F depicts hand-positioning control of the eyepiece.
Figure 16 depicts a location-based application mode of the eyepiece.
Figure 17 shows A) a flexible platform of uncooled CMOS image sensors capable of VIS/NIR/SWIR imaging and B) the difference in image quality compared with an image-intensified night vision system.
Figure 18 depicts an augmented-reality-enabled custom billboard.
Figure 19 depicts an augmented-reality-enabled custom advertisement.
Figure 20 depicts an augmented-reality-enabled custom artwork.
Figure 20A depicts a method for posting messages to be sent when the viewer reaches a certain location.
Figure 21 depicts an alternative arrangement of the eyepiece optics and electronics.
Figure 22 depicts an alternative arrangement of the eyepiece optics and electronics.
Figure 22A depicts an example of eyeglow with the eyepiece.
Figure 22B depicts a cross section of an eyepiece with a light control element for reducing eyeglow.
Figure 23 depicts an alternative arrangement of the eyepiece optics and electronics.
Figure 24 depicts a lock position of a virtual keyboard.
Figure 24A depicts an embodiment of a virtually projected image on a part of the human body.
Figure 25 depicts a detailed view of the projector.
Figure 26 depicts a detailed view of the RGB LED module.
Figure 27 depicts a gaming network.
Figure 28 depicts a method for gaming with the augmented reality glasses.
Figure 29 depicts an exemplary electronic circuit diagram for the augmented reality eyepiece.
Figure 29A depicts control circuitry for eye-tracking control of an external device.
Figure 29B depicts a communication network among users of augmented reality eyepieces.
Figure 30 depicts partial image removal by the eyepiece.
Figure 31 depicts a flowchart of a method for identifying a person based on the person's speech as captured by a microphone of the augmented reality device.
Figure 32 depicts a typical camera for use in video calls or conferences.
Figure 33 shows an embodiment of a block diagram of a video call camera.
Figure 34 depicts embodiments of an optically or digitally stabilized eyepiece.
Figure 35 depicts an embodiment of the classic Cassegrain configuration.
Figure 36 depicts the configuration of a micro-Cassegrain telescoping folded optic camera.
Figure 37 depicts the swipe process of a virtual keyboard.
Figure 38 depicts the target marker process of a virtual keyboard.
Figure 38A depicts an embodiment of a visual word translator.
Figure 39 illustrates glasses for biometric data capture according to an embodiment.
Figure 40 illustrates iris recognition using the biometric data capture glasses according to an embodiment.
Figure 41 depicts face and iris recognition according to an embodiment.
Figure 42 illustrates the use of dual omni-directional microphones according to an embodiment.
Figure 43 depicts directionality improvements with multiple microphones.
Figure 44 shows the use of adaptive arrays to steer the audio capture facility according to an embodiment.
Figure 45 shows a mosaic finger and palm enrollment system according to an example embodiment.
Figure 46 illustrates the traditional optical approach used by other fingerprint and palmprint systems.
Figure 47 shows the approach used by the mosaic sensor according to an example embodiment.
Figure 48 shows the device layout of the mosaic sensor according to an example embodiment.
Figure 49 illustrates the camera field of view and number of cameras used in the mosaic sensor according to another embodiment.
Figure 50 shows the bio-phone and tactical computer according to an embodiment.
Figure 51 shows the use of the bio-phone and tactical computer according to an embodiment in capturing latent fingerprints and palmprints.
Figure 52 shows a typical DOMEX collection.
Figure 53 shows the relationship between biometric images captured using the bio-phone and tactical computer and a biometric watch list, according to an embodiment.
Figure 54 shows a pocket bio-kit according to one embodiment.
Figure 55 shows the components of the pocket bio-kit according to one embodiment.
Figure 56 depicts a fingerprint, palmprint, geo-location, and POI enrollment device according to an embodiment.
Figure 57 shows a system for multi-modal biometric collection, identification, geo-location, and POI enrollment according to an embodiment.
Figure 58 illustrates a forearm-wearable device for fingerprint, palmprint, geo-location, and POI enrollment according to an embodiment.
Figure 59 shows a mobile folding biometric enrollment kit according to one embodiment.
Figure 60 is a high-level system diagram of the biometric enrollment kit according to one embodiment.
Figure 61 is a system diagram of the folding biometric enrollment device according to one embodiment.
Figure 62 shows a thin-film fingerprint and palmprint sensor according to an example embodiment.
Figure 63 shows a biometric collection device for collecting finger, palm, and enrollment data according to an example embodiment.
Figure 64 illustrates two-stage palmprint capture according to an embodiment.
Figure 65 illustrates fingertip tap capture according to an embodiment.
Figure 66 illustrates slap and roll print capture according to an embodiment.
Figure 67 depicts a system for taking contactless fingerprints, palmprints, or other biometric prints.
Figure 68 depicts a process for taking contactless fingerprints, palmprints, or other biometric prints.
Figure 69 depicts an embodiment of a watch controller.
Figures 70A-D depict example embodiments of the eyepiece with an integrated display, including charging capability.
Figure 71 depicts an embodiment of a ground stake data system.
Figure 72 depicts a block diagram of a control mapping system including the eyepiece.
Figure 73 depicts a biometric flashlight.
Figure 74 depicts a helmet-mounted version of the eyepiece.
Figure 75 depicts an embodiment of situational awareness glasses.
Figure 76A depicts an assembled 360° imager, and Figure 76B depicts a cutaway view of the 360° imager.
Figure 77 depicts an exploded view of a multi-view camera.
Figure 78 depicts a flight eye.
Figure 79 depicts an exploded top view of the eyepiece.
Figure 80 depicts an exploded electro-optic assembly.
Figure 81 depicts an exploded view of the axis of the electro-optic assembly.
Figure 82 depicts an embodiment of an optical display system using a planar illumination facility with a reflective display.
Figure 83 depicts a structural embodiment of the planar illumination optical system.
Figure 84 depicts an assembly embodiment of a planar illumination facility and a reflective display with a laser speckle suppression component.
Figure 85 depicts an embodiment of a planar illumination facility with grooved features for redirecting light.
Figure 86 depicts an embodiment of a planar illumination facility with paired groove and 'anti-groove' features to reduce image aberrations.
Figure 87 depicts an embodiment of a planar illumination facility fabricated from a laminate structure.
Figure 88 depicts an embodiment of a planar illumination facility with a wedge-shaped optical component for redirecting light.
Figure 89 depicts a block diagram of an illumination module according to an embodiment of the invention.
Figure 90 depicts a block diagram of an optical frequency converter according to an embodiment of the invention.
Figure 91 depicts a block diagram of a laser illumination module according to an embodiment of the invention.
Figure 92 depicts a block diagram of a laser illumination system according to another embodiment of the invention.
Figure 93 depicts a block diagram of an imaging system according to an embodiment of the invention.
Figures 94A and B depict a lens with a photochromic element and heating elements in top and side views, respectively.
Figure 95 depicts an embodiment of an LCoS frontlight design.
Figure 96 depicts a prism optically bonded with a polarizer.
Figure 97 depicts a prism optically bonded with a polarizer.
Figure 98 depicts multiple embodiments of LCoS frontlight designs.
Figure 99 depicts a wedge plus OBS overlaid on an LCoS.
Figure 100 depicts two versions of the wedge.
Figure 101 depicts a curved PBS film over an LCoS chip.
Figure 102A depicts an embodiment of the optical assembly.
Figure 102B depicts an embodiment of the optical assembly with an embedded camera.
Figure 103 depicts an embodiment of an image source.
Figure 104 depicts an embodiment of an image source.
Figure 105 depicts embodiments of an image source.
Figure 106 shows a top-level block diagram of a software application facility and marketplace in conjunction with the eyepiece, depicting aspects of functionality and control, in an embodiment of the invention.
Figure 107 depicts a functional block diagram of an eyepiece application development environment in an embodiment of the invention.
Figure 108 depicts a platform element development stack for software applications for the eyepiece in an embodiment of the invention.
Figure 109 is an illustration of a head-mounted display with see-through capability according to an embodiment of the invention.
Figure 110 is an illustration of a view of an unmarked scene as viewed through the head-mounted display depicted in Figure 109.
Figure 111 is an illustration of the view of the scene of Figure 110 with a 2D overlaid marker.
Figure 112 is an illustration of the 3D marker of Figure 111 as displayed to the viewer's left eye.
Figure 113 is an illustration of the 3D marker of Figure 111 as displayed to the viewer's right eye.
Figure 114 is an illustration of the left and right 3D markers of Figure 111 overlaid on one another to show the differences.
Figure 115 is an illustration of the view of the scene of Figure 110 with 3D markers.
Figure 116 is an illustration of captured stereo images of the scene of Figure 110.
Figure 117 is an illustration of the left and right stereo images of Figure 116 overlaid to show the differences between the images.
Figure 118 is an illustration of the scene of Figure 110 showing overlaid 3D markers.
Figure 119 is a flowchart of a depth-cue embodiment of the method of the invention for providing 3D markers.
Figure 120 is a flowchart of another depth-cue embodiment of the method of the invention for providing 3D markers.
Figure 121 is a flowchart of a further depth-cue embodiment of the method of the invention for providing 3D markers.
Figure 122 is a flowchart of yet another depth-cue embodiment of the method of the invention for providing 3D markers.
Figure 123A depicts a processor for providing display sequence frames for rendering an image display by a display component.
Figure 123B depicts a display interface configured to eliminate the display driver.
Figure 124 is a schematic of a prior-art waveguide with multiple partial reflectors;
Figure 125 is a schematic of a waveguide with multiple electrically switchable mirrors in a first position;
Figure 125A is an illustration of a waveguide assembly with electrical connections.
Figure 126 is a schematic of the waveguide with multiple electrically switchable mirrors in a second position;
Figure 127 is a schematic of the waveguide with multiple electrically switchable mirrors in a third position;
Figure 128 is a schematic of a waveguide with multiple mechanically switchable mirrors in a first position;
Figure 128A is a schematic of a waveguide assembly with micro-actuators and associated hardware;
Figure 129 is a schematic of the waveguide with multiple mechanically switchable mirrors in a second position;
Figure 130 is a schematic of the waveguide with multiple mechanically switchable mirrors in a third position;
Figures 131A and 131B are illustrations of a waveguide display with switchable mirrors on a user's face; and
Figures 132A-132C are illustrations of the display regions provided for users with different eye spacings.
Figure 133 is a schematic of a reflective image source with an edge light and a frontlight, illustrating the path of light through the system;
Figure 134 is a schematic of a prior-art frontlight that includes a groove;
Figure 135 is a schematic of a prior-art solid-block frontlight that includes a flat polarizing beam splitter and a curved reflector;
Figure 136 is a schematic of an embodiment of the invention with a single edge light and a curved wire-grid polarizer film;
Figure 137 is a schematic of an embodiment of the invention with two edge lights and a curved wire-grid polarizer film;
Figure 138 is a schematic of side frames that hold the flexible wire-grid polarizer film in the desired curved shape;
Figure 139 is a flowchart of the method of the invention.
Figure 140 is a schematic of a near-eye imaging system with a beam splitter;
Figure 141 is a schematic of an optics module for a near-eye imaging system;
Figure 142 is an illustration of a patterned-film optical sheet;
Figure 143 is an illustration of an insert-molded module housing with built-in optics;
Figure 144 is an illustration of compression molding of a laminated patterned optical sheet;
Figures 145A-C are illustrations of applying optical films in a molded module housing.
Figure 146 depicts a schematic front perspective view of an AR eyepiece (without its temples) according to an embodiment of the disclosure.
Figure 147 depicts a schematic rear perspective view of the AR eyepiece of Figure 146.
Figure 148 depicts a schematic partial rear perspective view of the wearer's right side of the AR eyepiece of Figure 146.
Figure 149 depicts a schematic partial rear perspective view of the wearer's right side of the AR eyepiece of Figure 146.
Figure 150 depicts a schematic perspective view of components of the AR eyepiece shown in Figure 146 that are used to support one of its projection screens.
Figure 151 depicts a schematic perspective view of an adjustment platform of the AR eyepiece shown in Figure 146.
Figure 152 depicts a schematic perspective view of components of a lateral adjustment mechanism of the AR eyepiece shown in Figure 146.
Figure 153 depicts a schematic perspective view of components of a tilt adjustment mechanism of the AR eyepiece shown in Figure 146.
Figure 154 is a chart showing the dark adaptation curve of the human eye.
Figure 155 is a chart showing the effect of gradually decreasing illumination on the dark adaptation curve of the human eye.
Figure 156 is an illustration of a head-mounted display with see-through capability.
Figure 157 is a chart showing the relationship between display brightness and time when entering a dark environment.
Figure 158 is a flowchart of a dark adaptation method.
Figure 159 depicts a virtual keyboard presented in the user's field of view.
Figure 160 depicts an example of a display system with an optically flat reflective surface.
Figure 161 shows an illustration of a near-eye display module.
Figure 162 shows an illustration of the optics associated with this type of head-mounted display.
Figure 163 shows an illustration in which baffles are added inside the housing between the illumination beam splitter and the lens.
Figure 164 shows an illustration of another embodiment of the invention in which a baffle is added at an internal surface within the lens.
Figure 165 shows an illustration of a further embodiment of the invention in which a baffle is added at the output of the lens.
Figure 166 shows an illustration of another embodiment of the invention in which a baffle is attached to the housing between the lens and the imaging beam splitter.
Figure 167 shows an illustration of a further embodiment of the invention in which an absorptive coating is applied to the sidewalls of the housing.
Figure 168 shows an illustration of another source of stray light in a head-mounted display, where the stray light enters directly from the edge of the light source.
Figure 169 depicts stray light reflected from any reflective surface in the housing or from the edges of the lens.
Figure 170 shows an illustration of yet another embodiment of the invention in which a baffle is positioned adjacent to the light source.
Figure 171 depicts an absorptive coating in which ridges can be used, where a series of small ridges or steps act as a series of baffles over the entire sidewall area of the housing to block or trim marginal rays.
Figure 172 shows another embodiment of a band or sheet that includes a slide and ridges that can be used to block reflected light.
Figure 173 depicts an exploded view of an embodiment of the glasses.
Figure 174 depicts the wiring design and wire guides of the glasses.
Figure 175 depicts an enlarged version of the wiring design and wire guides of the glasses.
Figure 176A shows a cross-sectional view of the wiring design and wire guides of the glasses.
Figure 176B shows a cross-sectional view of the wiring design and wire guides of the glasses.
Figure 176C shows the full version of the wiring design and wire guides of the glasses.
Figure 177 depicts a U-shaped attachment for securing the glasses.
Figure 178 depicts an embodiment of a cable tensioning system for securing the glasses to a user's head.
Figures 179A and 179B depict an embodiment of a cable tensioning system in a curved configuration for securing the glasses to a user's head.
Figure 180 depicts an embodiment of a cable tensioning system for securing the glasses to a user's head.
Figure 181 depicts an embodiment of a system for securing the glasses to a user's head.
Figure 182 depicts an embodiment of a system for securing the glasses to a user's head.
Figure 183 depicts an embodiment of a system for securing the glasses to a user's head.
Figure 184 depicts an embodiment of a system for securing the glasses to a user's head.
Figure 185A depicts an embodiment of an optical train.
Figure 185B depicts a sample ray trace of light in an embodiment of the optical train.
Figure 186 depicts an embodiment of an LCoS plus ASIC package.
Figure 187 is a schematic illustration of a prior-art frontlight that uses a single light source and a beam splitter block;
Figure 188 is a schematic illustration of a prior-art frontlight that uses a single light source and a reflective beam splitter layer;
Figure 189 is a schematic illustration of a frontlight that uses a single light source, in which a flat reflective beam splitter layer is placed at a reduced angle;
Figure 190 is a schematic illustration of a frontlight that uses a single light source, in which the reflective beam splitter layer is curved;
Figure 191 is a schematic illustration of a frontlight that uses dual light sources, in which a folded reflective beam splitter film with flat surfaces is placed in a transparent solid;
Figure 192 is a schematic illustration of a frontlight that uses dual light sources, in which a folded, unsupported reflective beam splitter film with flat surfaces is used;
Figure 193 is a schematic illustration of a frontlight that uses dual light sources, in which a folded, unsupported reflective beam splitter film with curved surfaces is used;
Figure 194 is a schematic illustration of a frontlight that uses dual light sources, in which a folded reflective beam splitter film with curved surfaces is placed in a transparent solid;
Figure 195 is a schematic illustration of a frontlight that uses a single light source and has an opposing mirror and quarter-wave film to recycle a portion of the polarized light, in which a folded reflective beam splitter film with flat surfaces is placed in a transparent solid;
Figure 196 is a schematic illustration of a frontlight that uses a single light source and has an opposing mirror and quarter-wave film to recycle a portion of the polarized light, provided with a folded, unsupported reflective polarizing beam splitter film with flat surfaces;
Figure 197 is a schematic illustration of a frontlight that uses a single light source and has an opposing mirror and quarter-wave film to recycle a portion of the polarized light, provided with a folded, unsupported reflective polarizing beam splitter film with curved surfaces;
Figure 198 is a schematic illustration of a method of making a frontlight such as that shown in Figure 197, but in which a folded reflective beam splitter film with flat surfaces is placed in a transparent solid, where upper and lower film holders are used to shape and position the reflective beam splitter film and portions of the polarized light are recycled;
Figure 199 is a schematic illustration of a frontlight made using the method shown in Figure 198 and used with dual light sources with recycling of portions of the polarized light;
Figure 200 is a schematic illustration of a folded, unsupported reflective beam splitter film supported at its edges in the first step of a method of casting a solid frontlight;
Figure 201 is a schematic illustration showing ports for injecting the transparent casting material and venting gas in the method of casting a solid frontlight;
Figure 202 is a schematic illustration showing the casting of the top of a cast solid frontlight;
Figure 203 is a schematic illustration showing the use of a flat transparent sheet to flatten the top of a cast solid frontlight;
Figure 204 is a flowchart of a method of making a solid frontlight by assembly;
Figure 205 is a flowchart of a method of making a solid frontlight by casting; and
Figure 206 is a flowchart of a method of making a solid film holder using a multi-step molding process.
Figure 207 depicts an embodiment of a near-field communication watch.
Figure 208 depicts an embodiment of a near-field communication watch interfacing with an NFC-enabled point-of-service device.
Figure 209 depicts an embodiment of a near-field communication watch docked with an NFC-enabled point-of-service device and a user's smartphone.
Detailed Description
The present invention relates to eyepiece electro-optics. The eyepiece may include projection optics suitable for projecting an image onto a see-through or translucent lens, thereby allowing the wearer of the eyepiece to view the surrounding environment as well as the displayed image. The projection optics, also referred to as a projector, may include an RGB LED module that uses field-sequential color. With field-sequential color, a single full-color image may be broken down into color fields based on the primary colors red, green, and blue, and imaged individually by an LCoS (liquid crystal on silicon) optical display 210. As each color field is imaged by the optical display 210, the corresponding LED color is turned on. When these color fields are displayed in rapid sequence, a full-color image may be seen. With field-sequential color illumination, the resulting projected image in the eyepiece can be adjusted for any chromatic aberrations by shifting the red image relative to the blue and/or green image, and so on. The image may thereafter be reflected into a pair of free-form curved waveguides, where the image light engages in total internal reflection (TIR) until it reaches the active viewing area of the lens where the user sees the image. A processor, which may include memory and an operating system, may control the LED light source and the optical display. The projector may also include or be optically coupled to a display coupling lens, a condenser lens, a polarizing beam splitter, and a field lens.
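The field-sequential color scheme described above can be illustrated with a minimal sketch: a full-color frame is split into red, green, and blue fields that are displayed one after another, and an individual color field can be shifted relative to the others to compensate for lateral chromatic aberration. The data layout (rows of RGB tuples) and the function names are assumptions for illustration, not part of the patent.

```python
def to_color_fields(frame):
    """Split a full-color frame (rows of (r, g, b) tuples) into the three
    monochrome color fields shown one after another in field-sequential
    color: while each field is on the LCoS panel, only the matching LED
    is lit, and rapid sequencing fuses them into a full-color image."""
    fields = {}
    for i, name in enumerate(("red", "green", "blue")):
        fields[name] = [[px[i] for px in row] for row in frame]
    return fields

def shift_field(field, dx):
    """Shift one color field horizontally by dx pixels (zero fill) -- the
    kind of per-channel offset the text describes for correcting
    chromatic aberration by moving, e.g., red relative to green/blue."""
    w = len(field[0])
    out = []
    for row in field:
        if dx >= 0:
            out.append([0] * dx + row[: w - dx])
        else:
            out.append(row[-dx:] + [0] * -dx)
    return out

frame = [[(255, 0, 0), (0, 255, 0)], [(0, 0, 255), (255, 255, 255)]]
fields = to_color_fields(frame)
print(fields["red"])                  # [[255, 0], [0, 255]]
print(shift_field(fields["red"], 1))  # [[0, 255], [0, 0]]
```

In hardware the shift would be applied when the field is written to the panel, so the correction costs no extra frame memory.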
With reference to Figures 123A and 123B, a processor 12302 (e.g., a digital signal processor) may provide display sequence frames 12324 for use by a display component 12328 of the eyepiece 100 (e.g., an LCoS display component) in displaying an image. In embodiments, the sequence frames 12324 may be generated with or without a display driver 12312 acting as an intermediate component between the processor 12302 and the display component 12328. For example, and referring to Figure 123A, the processor 12302 may include a frame buffer 12304 and a display interface 12308 (e.g., a Mobile Industry Processor Interface (MIPI) with a display serial interface (DSI)). The display interface 12308 may supply per-pixel RGB data 12310 to the display driver 12312 serving as the intermediate component between the processor 12302 and the display component 12328, where the display driver 12312 receives the per-pixel RGB data 12310 and generates separate full-frame display data for red 12318, green 12320, and blue 12322, thereby supplying the display sequence frames 12324 to the display component 12328. In addition, the display driver 12312 may provide timing signals to the display component 12328, such as to synchronize the transmission of the full frames 12318, 12320, and 12322 as the display sequence frames 12324. In another example, and referring to Figure 123B, a display interface 12330 may be configured to eliminate the display driver 12312 by providing the full-frame display data for red 12334, green 12338, and blue 12340 directly to the display component 12328 as the display sequence frames 12324. In addition, a timing signal 12332 may be supplied directly from the display interface 12330 to the display component. This configuration may provide significantly lower power consumption by removing the need for a display driver. Not only is the driver removed in this direct-to-panel configuration, but the overall logic of the configuration is simplified, and the redundant memory needed to generate per-pixel information from frames is eliminated.
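As a concrete illustration of the driver's role described above, the sketch below splits per-pixel RGB data into the three separate full-color frames that would be emitted in sequence to a field-sequential panel. The data layout (nested lists of RGB tuples) is my own assumption for illustration, not a format stated in the patent.

```python
# Minimal sketch of the field-sequential conversion a display driver (12312)
# performs: per-pixel RGB data in, three full single-color frames out.
# The frame layout here is hypothetical.

def to_sequential_frames(rgb_frame):
    """Split a frame of (R, G, B) pixel tuples into three single-color frames."""
    red   = [[px[0] for px in row] for row in rgb_frame]
    green = [[px[1] for px in row] for row in rgb_frame]
    blue  = [[px[2] for px in row] for row in rgb_frame]
    return red, green, blue

# A 2x2 test frame
frame = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (128, 128, 128)]]
r, g, b = to_sequential_frames(frame)
```

In the Figure 123B configuration, this same separation would be done by the display interface 12330 itself, with the panel relying on a timing signal to know which color frame is arriving.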
With reference to Figure 186, in embodiments, to improve the yield of the LCoS + ASIC package 18600, the ASIC may be mounted on a flexible printed circuit (FPC) 18604 with a stiffener on top. If the stiffener is no taller than the ASIC, it does not add thickness to the overall package. The FPC may be connected to a standard LCoS package (e.g., a glass-fiber-reinforced epoxy laminate (FR4) 18608) via a connector 18602, such as a zero-insertion-force (ZIF) connector or, for higher pin counts, a board-to-board connector. A pressure-sensitive adhesive may be used to attach the ASIC, the stiffener, and the LCoS to the FPC.
With reference to Fig. 1, an illustrative embodiment of an augmented reality eyepiece 100 may be depicted. It will be understood that embodiments of the eyepiece 100 may not include all of the elements depicted in Fig. 1, while other embodiments may include additional or different elements. In embodiments, the optical elements may be embedded in the temple portion 122 of the frame 102 of the eyepiece. An image may be projected with a projector 108 onto at least one lens 104 mounted in an opening of the frame 102. One or more projectors 108, such as a nano-projector, pico-projector, micro-projector, femto-projector, laser-based projector, or holographic projector, may be mounted in the temple portion of the eyepiece frame 102. In embodiments, both lenses 104 are see-through or translucent, while in other embodiments only one lens 104 is translucent and the other is opaque or missing. In embodiments, more than one projector 108 may be included in the eyepiece 100.
In embodiments such as the one depicted in Fig. 1, the eyepiece 100 may also include at least one articulating earphone 120, a radio transceiver 118, and a heat sink 114 that absorbs heat from the LED light engine, keeping it cool and allowing it to operate at full brightness. There are also one or more TI OMAP4 (Open Multimedia Applications Processor) chips 112 and a flex cable with an RF antenna 110, all of which are described in greater detail herein.
In an embodiment, and referring to Fig. 2, the projector 200 may be an RGB projector. The projector 200 may include a housing 202, a heat sink 204, and an RGB LED engine or module 206. The RGB LED engine 206 may include LEDs, dichroics, concentrators, and the like. A digital signal processor (DSP) (not shown) may convert the images or video stream into control signals, such as voltage drops/current modifications, pulse-width modulation (PWM) signals, and the like, to control the intensity, duration, and mixing of the LED light. For example, the DSP may control the duty cycle of each PWM signal in order to control the average current flowing through each of the LEDs that generate the plurality of colors. A still-image coprocessor of the eyepiece may employ noise filtering, image/video stabilization, and face detection, and may perform image enhancement. An audio back-end processor of the eyepiece may employ buffering, sample-rate conversion (SRC), equalization, and the like.
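The duty-cycle-to-current relationship the DSP relies on can be sketched in a few lines. For an ideally switched LED, average current is simply peak current times duty cycle; the numbers below are made up for illustration and are not values from the patent.

```python
# Illustrative only: average LED current under PWM control, as the DSP
# described above would set it. Assumes an ideal switched LED, where
# I_avg = duty_cycle * I_peak. Values are hypothetical.

def average_led_current(i_peak_ma, duty_cycle):
    """Average current (mA) through an LED driven at i_peak_ma with the
    given PWM duty cycle in [0, 1]."""
    if not 0.0 <= duty_cycle <= 1.0:
        raise ValueError("duty cycle must be in [0, 1]")
    return i_peak_ma * duty_cycle

# 20 mA peak drive at 25% duty cycle -> 5 mA average
i_avg = average_led_current(20.0, 0.25)
```

Varying the duty cycle independently per color channel is what lets the DSP mix the red, green, and blue contributions without changing the drive voltage.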
The projector 200 may include an optical display 210, such as an LCoS display, and a number of components as shown in the figure. In embodiments, the projector 200 may be designed with a single-panel LCoS display 210; however, a three-panel display is also possible. In the single-panel embodiment, the display 210 is illuminated sequentially with red, blue, and green light (i.e., field sequential color). In other embodiments, the projector 200 may use alternative optical display technologies, such as a back-lit liquid crystal display (LCD), a front-lit LCD, a transflective LCD, an organic light-emitting diode (OLED) display, a field-emission display (FED), a ferroelectric LCoS (FLCoS) display, liquid-crystal-on-sapphire technology, a transparent liquid crystal micro-display, a quantum dot display, and the like.
In embodiments, the display may be a 3D display, an LCD, a thin-film-transistor (TFT) LCD, an LED display, an LCoS display, a ferroelectric liquid-crystal-on-silicon display, a CMOS display, an OLED display, a QLED display, an OLED array with CMOS-style pixel sensors at the intersections between OLED pixels, a transmissive LCoS display, a CRT display, a VGA display, an SXGA display, a QVGA display, a display with a video-based gaze tracker, a display with exit-pupil-expansion technology, an Asahi film display, a free-form-surface optical display, an XY polynomial combiner display, a light-guide transmission display, an AMOLED display, and the like. In embodiments, the display may be a holographic display that allows the eyepiece to present an image from an image source as a hologram. In embodiments, the display may be a liquid crystal reflective micro-display. Such a display may include polarizing optics and may provide improved brightness compared to certain OLED micro-displays. In embodiments, the display may be a free-form prism display. A free-form prism display may enable 3D stereoscopic imaging capability. In embodiments, the display may be similar or identical to those described by Canon and Olympus in United States Patents 6,384,983 and 6,181,475, respectively. In other embodiments, the display may include a video-based gaze tracker. In embodiments, the beam from an infrared light source may be split and expanded in an exit pupil expander (EPE) to produce collimated beams from the EPE toward the eye. A miniature video camera may image the cornea, and the gaze direction of the eye may be calculated by locating the pupil and the glint from the infrared beam. After user calibration, the data from the gaze tracker may reflect the user's point of focus within the displayed image, which can be used as an input device. Such devices may be similar to those offered by the Nokia Research Center in Tampere, Finland. In addition, in embodiments, the display may include an exit pupil expander, which magnifies the exit pupil and relays the image to a new position. Thus, only a thin slab may need to be placed in front of the user's eye, and the image source may be located elsewhere. In further embodiments, the display may be an off-axis optical display. In embodiments, such a display may not be aligned with the mechanical center of its aperture. This prevents the primary aperture from being blocked by auxiliary optical elements, hardware, and/or sensors, and may allow hardware and/or sensors to be used at the focal point. For example, an active-matrix organic light-emitting diode (AMOLED) display may use the pixel design known as PenTile from Nouvoyance, which passes more light in several ways. First, the red, blue, and green sub-pixels are larger than the sub-pixels in a traditional display. Second, one sub-pixel in every four is clear. This means the backlight can use less power while shining brighter. Fewer sub-pixels would normally mean lower resolution, but a PenTile display uses image processing on individual sub-pixels to trick the eye into perceiving the same resolution while using about one third of the sub-pixels of an RGB stripe panel. A PenTile display also uses image processing algorithms to determine the brightness of the scene, automatically dimming the backlight for darker images.
To overcome the limitations of the prior art described above, the present invention provides an integral array of switchable mirrors in a waveguide, where the mirrors can be used sequentially to provide a progressive scan of portions of the image over the display field of view. By rapidly switching the mirrors from reflective to transmissive in a sequential fashion, the image can be provided to the user without perceptible flicker. Since each switchable mirror is in the transmissive state more often than in the reflective state, the array of switchable mirrors appears transparent to the user while also presenting the displayed image to the user.
The presentation of light from an image source by a waveguide is well known to those skilled in the art and therefore will not be discussed herein. Exemplary references on waveguides and the transmission of light from an image source to a display area are provided in United States Patents 5076664 and 6829095. The present invention comprises methods and apparatus for redirecting image light in a waveguide to provide an image to a user, wherein the image light in the waveguide is provided from an image source.
Figure 125 shows a waveguide display device 12500 having an integral array of switchable mirrors 12508a-12508c that redirects the light from the image source 12502 transmitted through the waveguide 12510 to provide image light 12504 to the user. Three switchable mirrors 12508a-12508c are shown, but in the present invention the array may include a different number of switchable mirrors. The switchable mirrors shown in Figure 125 are electrically switchable mirrors, including liquid crystal switchable mirrors. A cover glass 12512 is provided over the thin layer, shown as the switchable mirrors 12508a-12508c, that contains the liquid crystal material. Figure 125 also shows power supply lines 12514 and 12518.
The waveguide 12510 and the integral array of switchable mirrors 12508a-12508c can be made of plastic or glass material, provided the material is suitably flat. Uniformity of thickness is not as important as in most liquid crystal devices because the switchable mirrors have high reflectivity. The construction of switchable liquid crystal mirrors is described in United States Patent 6999649.
Figures 126 and 127 show the sequential aspect of the present invention, in which only one switchable mirror in the array is in the reflective state at any one time, while the other switchable mirrors in the array are in the transmissive state. Figure 125 shows the first switchable mirror 12508a in the reflective state, redirecting the light from the image source 12502 into image light 12504 that forms a portion of the image presented to the user. The other switchable mirrors 12508b and 12508c are in the transmissive state. The waveguide 12410 is also shown in Figure 124.
In Figure 126, switchable mirrors 12508a and 12508c are in the transmissive state, while switchable mirror 12508b is in the reflective state. This state provides the user with image light 12600 and its associated portion of the image. Finally, in Figure 127, switchable mirrors 12508a and 12508b are in the transmissive state, and switchable mirror 12508c is in the reflective state. This last state provides the user with image light 12700 and its associated portion of the image. After this last state, the sequence shown in Figure 125, followed by the states shown in Figures 126 and 127, is repeated to provide a progressive scan of the image. This sequence is repeated continuously while the user views the displayed image. Consequently, all of the light from the image source 12502 is redirected, in sequence, by a single switchable mirror at any given time. The image source can operate continuously while the switchable mirrors provide a progressive scan of the image light 12504 over the field of view. If the image light is perceived as brighter, or as having a different color balance, for different switchable mirrors, the image source can be adjusted to compensate: the image source brightness or color balance can be modulated in synchronization with the switching sequence of the array of switchable mirrors. In another embodiment of the invention, the order in which the switchable mirrors are switched can be altered to provide an interlaced image to the user, for example in the repeating order 1, 3, 2, 4 for an array of four switchable mirrors.
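The one-mirror-reflective-at-a-time schedule described above can be sketched as a simple state generator. The representation (one boolean per mirror per step) is my own; the interlaced example uses zero-based indices 0, 2, 1, 3 for the patent's order 1, 3, 2, 4.

```python
# Sketch of the mirror-switching schedule: at each step exactly one mirror
# is reflective (True) while all others are transmissive (False).

def mirror_states(order):
    """Yield, per step of the repeating sequence, a tuple of booleans,
    one per mirror, with True marking the single reflective mirror."""
    n = len(order)
    for active in order:
        yield tuple(i == active for i in range(n))

# Progressive scan for three mirrors, as in Figures 125-127
progressive = list(mirror_states([0, 1, 2]))

# Interlaced variant for four mirrors (patent order 1, 3, 2, 4, zero-based)
interlaced = list(mirror_states([0, 2, 1, 3]))
```

Driving the array is then a matter of replaying this sequence at the chosen frame rate, with the image source synchronized so each reflective mirror receives its portion of the image.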
Figure 128 shows another embodiment of the invention, in which an integral array of mechanically driven switchable mirrors is provided. In this case, the switchable mirrors in the waveguide display device 12800 comprise prisms 12804a-12804c that are moved to alternately provide either an air gap or optical contact at the respective surfaces 12810a-12810c. As shown in Figure 128, prism 12804a has been moved down to provide an air gap, so that surface 12810a acts as a reflective surface operating by total internal reflection. At the same time, prisms 12804b and 12804c are pushed up to provide optical contact at surfaces 12810b and 12810c respectively, so that surfaces 12810b and 12810c are transmissive. This state redirects the light from the image source 12502 into image light 12802, which presents a portion of the image to the user. In this embodiment, the switchable mirrors move from optical contact with nearly 100% transmission to total internal reflection with nearly 100% reflection. Figure 128 also shows power supply lines 12812, a base with a common ground 12814, and micro-actuators 12818a-c.
Figures 129 and 130 show the other states in the sequence for the mechanically driven switchable mirrors in the switchable mirror array. In Figure 129, prisms 12804a and 12804c are pushed up to provide optical contact with surfaces 12810a and 12810c respectively, thereby providing a transmissive state for the light from the image source 12502. At the same time, prism 12804b is moved down to create an air gap at surface 12810b, so that the light from the image source 12502 is redirected into image light 12900, which presents the associated portion of the image to the user. In the final step of the sequence, shown in Figure 130, prisms 12804a and 12804b are pushed up to provide optical contact at surfaces 12810a and 12810b respectively, so that the light from the image source passes through to surface 12810c. Prism 12804c is moved down to provide an air gap at surface 12810c, so that surface 12810c becomes a reflective surface with total internal reflection, and the light from the image source 12502 is redirected into image light 13000 and its associated portion of the image.
In the preceding discussion, the total internal reflection condition is based on the optical properties of the waveguide 12808 material and of air, as is known to those skilled in the art. To obtain a 90-degree reflection, the refractive index of the waveguide 12808 as shown in Figures 128-130 must be greater than 1.42. To provide optical contact between the prisms 12804a-12804c and the surfaces 12810a-12810c respectively, the surfaces of the prisms 12804a-12804c must match the surfaces 12810a-12810c to within 1.0 micron. Finally, for the light from the image source 12502 to travel through the waveguide 12808 and the prisms 12804a-12804c without being deflected at the interfaces, the refractive index of the prisms 12804a-12804c must match the refractive index of the waveguide 12808 to within approximately 0.1.
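The n > 1.42 requirement follows from total internal reflection at a 45-degree facet (a 90-degree fold of the light path): TIR requires the incidence angle to exceed the critical angle arcsin(1/n), so sin(45°) > 1/n, i.e. n > 1/sin(45°) ≈ 1.414. This derivation is my reconstruction of where the quoted figure comes from; a quick check:

```python
# Minimum refractive index for TIR when folding the beam by a given angle,
# assuming incidence angle = fold angle / 2 (a symmetric mirror fold).
import math

def min_index_for_tir(fold_deg):
    """Smallest index n such that TIR occurs at the incidence angle
    needed to fold the beam by fold_deg degrees."""
    incidence = math.radians(fold_deg / 2.0)
    return 1.0 / math.sin(incidence)

n_min = min_index_for_tir(90.0)  # ~1.414, consistent with "greater than 1.42"
```

The example materials given later (index 1.53 glass or plastic) sit comfortably above this bound.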
Figures 131a and 131b show illustrations of the waveguide assembly 13102, with its switchable mirror array, as included in the present invention. Figure 131a shows a side view of the waveguide assembly 13102 on a user's head, in which the long axis of the switchable mirror array is oriented vertically so that the image light 13100 is directed into the user's eye. Figure 131b shows a top view of the waveguide assembly 13102 on the user's head, in which the short axis of the switchable mirror array 13104 can be seen, and the image light 13100 is provided to the user's eye 13110. The field of view provided in the image light 13100 can be clearly seen in Figures 131a and 131b. In Figure 131b, the various portions of the image as provided by the different switchable mirrors in the array can also be seen. Figure 131b also shows an embodiment of the waveguide assembly 13102 that includes the image source 13108, where the image source 13108 has an internal light source to provide light from a miniature display such as an LCoS display or an LCD display; the light is then transmitted by the waveguide to the switchable mirrors, where it is redirected by a switchable mirror into the image light 13100 presented to the user's eye 13110.
To reduce the flicker in the image perceived by the user when the switchable mirrors are used to provide sequential portions of the image, the switchable mirror sequence preferably operates at a frequency faster than 60 Hz. In this case, each of the n switchable mirrors in the array is in the reflective state for (1/60) x (1/n) seconds in each cycle of the sequence, and is then in the transmissive state for (1/60) x ((n-1)/n) seconds. Consequently, each switchable mirror is in the transmissive state for a larger portion of each cycle of the sequence than it is in the reflective state, so the array of switchable mirrors is perceived by the user as relatively transparent.
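The per-cycle timing stated above can be computed directly. The sketch below simply evaluates the two expressions from the text for an n-mirror array cycling at 60 Hz:

```python
# Per-cycle timing for an n-mirror array, per the text: each mirror is
# reflective for (1/rate)*(1/n) s and transmissive for (1/rate)*((n-1)/n) s.

def mirror_timing(n, rate_hz=60.0):
    """Return (reflective_s, transmissive_s) per cycle for one mirror."""
    cycle = 1.0 / rate_hz
    reflective = cycle / n
    transmissive = cycle * (n - 1) / n
    return reflective, transmissive

refl, trans = mirror_timing(3)  # three mirrors at 60 Hz
```

For three mirrors at 60 Hz, each mirror is reflective for about 5.6 ms and transmissive for about 11.1 ms per cycle, which is why the array reads as mostly transparent.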
In another embodiment of the invention, the integral array of switchable mirrors has more switchable mirrors than are needed to cover the display area. The additional switchable mirrors are used to provide adjustment for users with different eye spacings (also known as interpupillary distances). In this case, the switchable mirrors used to present the image to the user are adjacent to one another, so that they present a continuous image area. Depending on the user's eye spacing, the switchable mirrors at the edges of the array are either used or not used. In the example shown in Figures 132A-132C, an array 13200 with seven switchable mirrors, each 3 mm wide, is provided. During use, five adjacent switchable mirrors are used to provide a 15-mm-wide display area (13202a-13202c), with ±3 mm of eye-spacing adjustment. In the narrow eye-spacing condition shown in Figure 132A, the five switchable mirrors toward the inner edge are used for display, and the two outer switchable mirrors are not used. In the wide eye-spacing condition shown in Figure 132C, the five switchable mirrors toward the outer edge are used for display, and the two inner switchable mirrors are not used. An intermediate condition is shown in Figure 132B, in which the middle five switchable mirrors are used and the outermost and innermost switchable mirrors are not used. In the present invention, the term "not used" means that the switchable mirror is kept in the transmissive state, while the other switchable mirrors are used according to the repeating sequence of transmissive and reflective states.
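The selection of which five of the seven mirrors are active can be sketched as a small helper. The function name, the offset convention (millimeters from the centered position, positive outward), and the clamping behavior are my own assumptions for illustration.

```python
# Sketch of interpupillary-distance adjustment for a 7-mirror array of
# 3 mm mirrors, per Figures 132A-132C: five adjacent mirrors are driven,
# the rest stay transmissive ("not used"). Conventions are hypothetical.

def active_mirrors(total=7, used=5, offset_mm=0.0, mirror_width_mm=3.0):
    """Return indices of the mirrors driven in the display sequence."""
    max_shift = (total - used) // 2              # 1 mirror = 3 mm here
    shift = round(offset_mm / mirror_width_mm)   # offset in whole mirrors
    shift = max(-max_shift, min(max_shift, shift))
    start = (total - used) // 2 + shift
    return list(range(start, start + used))

centered = active_mirrors(offset_mm=0.0)    # Figure 132B: middle five
wide     = active_mirrors(offset_mm=3.0)    # Figure 132C: outer five
narrow   = active_mirrors(offset_mm=-3.0)   # Figure 132A: inner five
```

Because the adjustment is purely a matter of which mirrors participate in the switching sequence, no mechanical adjustment of the eyepiece is required.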
Examples
In a first example, liquid crystal switchable mirrors with fast response, provided by Kent Optronics of Hopewell Junction, New York, USA (http://www.kentoptronics.com/), are used. The waveguide is made of glass or plastic, and the liquid crystal is contained in the space between the layers, so that the liquid crystal is 5 microns thick. A cover glass contains the liquid crystal on the outer surface. The response time is 10 milliseconds, the reflectivity is 87% in the reflective state, and the transmission is 87% in the transmissive state. Three switchable mirrors can be driven in a sequence operating at 30 Hz. If the switchable mirrors are 5 mm wide, a 15-mm-wide display area is provided, which equates to a 38-degree field of view viewed from an 8-mm-wide eyebox with the eye 10 mm from the waveguide.
In a second example, a mechanically driven array of prisms made of glass or plastic with a refractive index of 1.53 is provided, along with a waveguide made of the same material with a refractive index of 1.53. The surfaces of the prisms are polished to provide a flatness of less than 1 micron, and piezoelectric micro-actuators are used to move the prisms approximately 10 microns to switch from the transmissive state to the reflective state. The waveguide is molded to provide mating surfaces for the prisms with a flatness of less than 1 micron. Five switchable mirrors can be driven by the piezoelectric actuators so as to operate in a sequence running at 100 Hz. The piezoelectric micro-actuators are available from Steiner & Martins Inc. of Miami, FL (http://www.steminc.com/piezo/PZ_STAKPNViewPN.asp?PZ_SM_MODEL=SMPAK155510D10); each micro-actuator provides 10 microns of movement with more than 200 pounds of force in a 5x5x10 mm package driven at 150 V. The array of five prisms, each 5 mm wide, is used to provide a 25-mm-wide display area, which equates to a 72-degree field of view viewed from an 8-mm-wide eyebox with the eye 10 mm from the waveguide. Alternatively, using only three prisms at a time provides a 15-mm-wide display area (a 38-degree field of view) with the ability to shift the display area laterally by ±5 mm to adjust for the different eye spacings of different users.
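One plausible geometry behind the quoted 38-degree figure is the field of view common to every pupil position inside the eyebox: for a display of width w viewed at eye relief d through an eyebox of width e, the common half-angle is atan(((w - e)/2)/d). This formula is my reconstruction, not one stated in the text; it reproduces the first example's 38-degree figure, and the geometry behind the second example's 72-degree figure may differ.

```python
# Reconstructed (assumed) eyebox field-of-view model: the FOV seen from
# any pupil position inside the eyebox, limited by the extreme rays from
# the eyebox edges to the opposite display edges.
import math

def common_fov_deg(display_mm, eyebox_mm, relief_mm):
    half = math.atan(((display_mm - eyebox_mm) / 2.0) / relief_mm)
    return 2.0 * math.degrees(half)

fov = common_fov_deg(15.0, 8.0, 10.0)  # close to the quoted 38 degrees
```

Under this model, widening the display area or shrinking the eyebox directly trades off against field of view at a fixed eye relief.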
In embodiments, a waveguide display system may comprise an image source that provides image light from a displayed image, a waveguide that transmits the image light to a display area, and an integral array of switchable mirrors that redirects the image light from the waveguide to the display area where the displayed image can be viewed by a user. In embodiments, the switchable mirrors may be electrically driven. In embodiments, the switchable mirrors may be mechanically driven. In other embodiments, micro-actuators may be used to mechanically drive the switchable mirrors. Further, the micro-actuators may be piezoelectric. The switchable mirrors may be switched between transmissive and reflective states to provide portions of the image light in a progressive scan over the display area.
In embodiments, a method of providing a displayed image from a waveguide may comprise providing image light from an image source to the waveguide, providing an integral array of switchable mirrors in the waveguide over the display area, and sequentially operating the switchable mirrors between transmissive and reflective states to provide portions of the image light in a progressive scan over the display area.
In further embodiments, a waveguide display system with interpupillary distance adjustment may comprise an image source that provides image light from a displayed image, a waveguide that transmits the image light to a display area, and an integral array of switchable mirrors that redirects the image light from the waveguide to the display. In addition, the array of switchable mirrors may have more mirrors than are needed to cover the display area, and the switchable mirrors at the edges of the array may be used to provide a display area that matches the user's eye spacing.
The eyepiece may be powered by any power supply, such as battery power, solar power, line power, and the like. The power supply may be integrated into the frame 102 or located external to the eyepiece 100 and in electrical communication with the powered elements of the eyepiece 100. For example, a solar energy collector may be placed on the frame 102, on a belt clip, and so forth. Battery charging may occur using a wall charger, a car charger, on a belt clip, in an eyepiece case, and so forth.
The projector 200 may include an LED light engine 206, which may be mounted on the heat sink 204 with a holder 208 for ensuring frictionless mounting of the LED light engine, a hollow tapered light tunnel 220, a diffuser 212, and a condenser lens 214. The hollow tunnel 220 helps to homogenize the rapidly varying light from the RGB LED light engine. In one embodiment, the hollow light tunnel 220 includes a silvered coating. The diffuser lens 212 further homogenizes and mixes the light before it is directed to the condenser lens 214. The light leaves the condenser lens 214 and then enters a polarizing beam splitter (PBS) 218. In the PBS, the LED light is propagated and split into polarization components before being refracted to a field lens 216 and the LCoS display 210. The LCoS display provides the image for the micro-projector. The image then reflects from the LCoS display and passes back through the polarizing beam splitter, where it is reflected 90 degrees. The image thus leaves the micro-projector 200 at approximately the middle of the micro-projector, and the light is then directed to the coupling lens 504, described below.
Fig. 2 depicts an embodiment of the projection components and their supporting elements as described herein, but one skilled in the art may use other configurations and optical technologies. For example, instead of using reflective optics, transmissive configurations, such as those with a sapphire substrate, may be used to realize the optical path of the projector system, thus potentially changing and/or eliminating optical components such as beam splitters, redirecting mirrors, and the like. The system may have a backlight system, in which an RGB LED triplet may be the light source, oriented so that the light passes through the display. As a result, the backlight and display may be mounted adjacent to the waveguide, or collimating/directing optics may be present after the display to enable the light to properly enter the optics. Without directing optics, the display may be mounted on top of the waveguide, at its side, and the like. In one example, a small transparent display may be realized with a silicon active backplane on a transparent substrate (e.g., sapphire), with transparent electrodes controlled by the silicon active backplane, a liquid crystal material, polarizers, and the like. The function of the polarizers may be to correct for depolarization of the light passing through the system in order to improve the contrast of the display. In another example, the system may utilize a spatial light modulator that applies some form of spatially varying modulation to the optical path, such as a microchannel spatial light modulator in which diaphragm-mirror optical shutters are based on micro-electro-mechanical systems (MEMS). The system may also utilize other optical components, such as tunable optical filters (e.g., with deformable membrane actuators), high-angle-deflection micromirror systems, discrete phase optical elements, and the like.
In other embodiments, the eyepiece may use OLED displays, quantum dot displays, and the like, where these displays may provide higher power efficiency, a brighter display, lower-cost components, and the like. In addition, display technologies such as OLED and quantum dot displays may provide flexible displays, thus allowing greater packaging efficiency that may reduce the overall size of the eyepiece. For example, OLED and quantum dot display materials may be printed onto plastic substrates by stamping techniques, thereby producing flexible display components. For example, an OLED (organic LED) display may be a flexible, low-power display that does not require a backlight. It can be curved, as standard eyeglass lenses are. In one embodiment, the OLED display may be a transparent display or provide a transparent display. In embodiments, a high modulation transfer function permits combinations of resolution levels and device sizes (e.g., frame thickness) that were previously unachievable.
With reference to Figure 82, the eyepiece may utilize a planar illumination facility 8208 in association with a reflective display 8210, where a light source 8202 couples 8204 with the edge of the planar illumination facility 8208, and where the planar side of the planar illumination facility 8208 illuminates the reflective display 8210, which provides the imaging of the content presented through delivery optics 8212 to the wearer's eye 8222. In embodiments, the reflective display 8210 may be an LCD, an LCoS (liquid crystal on silicon) display, a cholesteric liquid crystal display, a guest-host liquid crystal display, a polymer-dispersed liquid crystal display, a phase-retardation liquid crystal display, or the like, or other liquid crystal technologies known in the art. In other embodiments, the reflective display 8210 may be a bistable display, such as an electrophoretic, electrofluidic, electrowetting, or electrokinetic display, a cholesteric liquid crystal display, or the like, or any other bistable display known in the art. The reflective display 8210 may also be a combination of LCD technology and bistable display technology. In embodiments, the coupling 8204 between the light source 8202 and the "edge" of the planar illumination facility 8208 may be made through other surfaces of the planar illumination facility 8208 and then directed into the plane of the planar illumination facility 8208, such as initially through the top surface, the bottom surface, an angled surface, and the like. For example, light may enter the planar illumination facility from the top surface but strike a 45-degree facet, so that the light is bent into the direction of the plane. In alternative embodiments, this bending of the light may be achieved with optical coatings.
In one example, the light source 8202 may be an RGB LED source (e.g., an LED array) directly coupled 8204 to the edge of the planar illumination facility. The light entering the edge of the planar illumination facility is then directed to the reflective display for imaging, such as described herein. The light may enter the reflective display for imaging and then be redirected back through the planar illumination facility (e.g., using a reflective surface at the back of the reflective display). The light may then enter the delivery optics 8212 to direct the image to the wearer's eye 8222, such as through a lens 8214, reflecting off a beam splitter 8218 to a reflective surface 8220, and passing back through the beam splitter 8218 to the eye 8222. Although the delivery optics 8212 have been described in terms of elements 8214, 8218, and 8220, one skilled in the art will appreciate that the delivery optics 8212 may include any delivery optics configuration known in the art, including configurations more complex or simpler than the one described here. For example, a different focal length for the field lens 8214 could allow the beam splitter 8218 to bend the image directly toward the eye, thereby eliminating the curved mirror 8220 and achieving a simpler design and implementation. In embodiments, the light source 8202 may be an LED light source, a laser light source, a white light source, or the like, or any other light source known in the art. The optical coupling 8204 may be a direct coupling between the light source 8202 and the planar illumination facility 8208, or it may be made through a coupling medium or mechanism, such as a waveguide, an optical fiber, a light pipe, a lens, and the like. The planar illumination facility 8208 may receive the light and redirect it out through its planar side by means of interference gratings, optical imperfections, scattering features, reflective surfaces, refractive elements, and the like in its structure. The planar illumination facility 8208 may be the cover glass on the reflective display 8210, such as to reduce the combined thickness of the reflective display 8210 and the planar illumination facility 8208. The planar illumination facility 8208 may also include a diffuser, located on the side nearest the delivery optics 8212, to expand the cone angle of the image light as the image light passes through the planar illumination facility 8208 to the delivery optics 8212. The delivery optics 8212 may include multiple optical elements, lenses, mirrors, beam splitters, and the like, or any other light-delivery elements known in the art.
Figure 83 presents an embodiment of an optical system 8302 for the eyepiece 8300, in which a planar illumination facility 8310 and a reflective display 8308 mounted on a substrate 8304 are shown interfacing with delivery optics that include an initial diffusing lens 8312, a beam splitter 8314, and a spherical mirror 8318; the delivery optics present the image to an eyebox 8320, where the wearer's eye receives the image. In one example, the flat beam splitter 8314 may be a wire-grid polarizer, a partially transmissive metal mirror coating, or the like, and the spherical reflector 8318 may carry a series of dielectric coatings to provide a partial mirror on its surface. In another embodiment, the coating on the spherical mirror 8318 may be a thin metal coating providing a partially transmissive mirror.
In an embodiment of the optical system, Figure 84 shows a planar illumination facility 8408 as part of a ferroelectric lightwave circuit (FLC) 8404, where the FLC 8404 includes a configuration in which a laser light source 8402 is coupled to the planar illumination facility 8408 by way of waveguide wavelength converters 8420, 8422, and where the planar illumination facility 8408 uses grating technology to direct light entering from its edge toward the planar surface of the reflective display 8410. Image light from the reflective display 8410 is then redirected back through the planar illumination facility 8408 to the delivery optics, via a hole 8412 in the support structure 8414. Because this embodiment utilizes a laser, the FLC also uses light fed back through the system to broaden the laser spectrum, as described in United States Patent 7,265,896, thereby reducing speckle from the laser. In this embodiment, the laser light source 8402 is an IR laser source, where the FLC combines the beams into RGB, with back-reflections that cause the laser to mode-hop and produce a broadened bandwidth that provides speckle suppression. In this embodiment, the speckle suppression occurs in the waveguide 8420. Laser light from the laser source 8402 is coupled to the planar illumination facility 8408 through a multi-mode interference combiner (MMI) 8422. Each laser source port is positioned such that the light passing across the MMI combiner is superimposed onto a single output port to the planar illumination facility 8408. The grating of the planar illumination facility 8408 steers the light to produce uniform illumination of the reflective display. In embodiments, the grating element may use an extremely fine pitch (e.g., interferometric) to generate the illumination of the reflective display, where the illumination reflects back from the grating with very low scatter as the light passes through the planar illumination facility to the delivery optics. That is, the light exits in an aligned condition such that the grating is almost completely transparent. Note that the light feedback used in this embodiment is due to the use of a laser light source; when LEDs are used, speckle suppression may not be needed, because LEDs have sufficient bandwidth.
Figure 85 shows an embodiment of an optical system utilizing a planar illumination facility 8502 that includes a configuration with optical imperfections (in this case, a 'grooved' configuration). In this embodiment, the light source 8202 is directly coupled 8204 to the edge of the planar illumination facility 8502. The light then passes through the planar illumination facility 8502 and encounters small grooves 8504A-D in the planar illumination facility material, such as grooves in a sheet of polymethyl methacrylate (PMMA). In embodiments, as the grooves 8504A-D progress away from the input port, their spacing may be varied (e.g., becoming more 'aggressive', i.e., smaller, as they progress from 8504A to 8504D), their height may be varied, their pitch may be varied, and so on. The grooves 8504A-D then redirect the light to the reflective display 8210 as an incoherent array of light sources, such that fans of light propagate to the reflective display 8210, where the reflective display 8210 is far enough from the grooves 8504A-D that the illumination from the individual grooves overlaps to provide uniform illumination of the area of the reflective display 8210. In further embodiments, there may be an optimum spacing of the grooves, where the number of grooves per pixel on the reflective display 8210 could be increased to make the light more incoherent ('fuller'), but this in turn, because there are more grooves, introduces interference within the image and produces lower contrast in the image provided to the wearer. Although this embodiment has been described with reference to grooves, other optical imperfections (such as dots) are also possible.
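The varying groove spacing described above can be illustrated with a small energy-budget sketch (a hypothetical model, not from the patent; the five-groove count and the uniform-extraction goal are illustrative assumptions): for every groove to redirect equal power toward the reflective display, grooves farther from the input port must extract a growing fraction of the light remaining in the guide, which is one way to motivate the increasingly 'aggressive' groove geometry.

```python
def extraction_fractions(n_grooves):
    """Fraction of the remaining guided light each groove must redirect
    so that every groove delivers equal power to the display."""
    return [1.0 / (n_grooves - i) for i in range(n_grooves)]

n = 5  # assumed groove count, for illustration only
fracs = extraction_fractions(n)

remaining, extracted = 1.0, []
for f in fracs:
    extracted.append(remaining * f)  # power this groove sends to the display
    remaining -= remaining * f       # power left in the guide

print([round(f, 3) for f in fracs])      # [0.2, 0.25, 0.333, 0.5, 1.0]
print([round(e, 3) for e in extracted])  # [0.2, 0.2, 0.2, 0.2, 0.2]
```

The growing fractions mirror the 'smaller and smaller' spacing from 8504A to 8504D: the last groove must redirect everything that is left.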
In embodiments, and referring to Figure 86, anti-grooves 8604 (inverse protrusions) may be applied into the grooves of the planar illumination facility, such as in a 'snap-on' anti-groove assembly 8602, where the anti-grooves 8604 are placed into the grooves 8504A-D such that air gaps remain between the sidewalls of the grooves and the sidewalls of the anti-grooves. The air gap provides the abrupt change in perceived refractive index as the light passes through the planar illumination facility, which facilitates reflection of the light at the groove sidewalls. The application of the anti-grooves 8604 reduces aberrations and deflection of the image light caused by the grooves. That is, image light reflected from the reflective display 8210 is refracted by the groove sidewalls, so that it changes direction in accordance with Snell's law. By providing anti-grooves in the grooves, where the sidewall angles of the grooves match the sidewall angles of the anti-grooves, the refraction of the image light is compensated and the image light is redirected toward the delivery optics 8214.
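A brief numeric sketch of the Snell's-law compensation just described (the refractive index of PMMA and the 10-degree incidence angle are assumed values for illustration): crossing the PMMA-to-air gap at a groove sidewall deviates a ray, and re-entering PMMA through a parallel anti-groove sidewall deviates it back by exactly the same amount.

```python
import math

def refract(theta_deg, n1, n2):
    """Snell's law: angle (degrees, from the surface normal) after
    passing from a medium of index n1 into a medium of index n2."""
    s = n1 * math.sin(math.radians(theta_deg)) / n2
    return math.degrees(math.asin(s))

N_PMMA, N_AIR = 1.49, 1.0   # assumed index for PMMA
incidence = 10.0            # assumed incidence on the groove sidewall, degrees

in_gap = refract(incidence, N_PMMA, N_AIR)   # ray bends away from normal in air
restored = refract(in_gap, N_AIR, N_PMMA)    # parallel anti-groove wall bends it back

print(round(in_gap, 3))    # 14.993: the uncompensated deviation
print(round(restored, 3))  # 10.0: matched sidewall angles cancel the refraction
```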
In embodiments, and referring to Figure 87, the planar illumination facility 8702 may be a laminate structure composed of multiple laminated layers 8704, where the laminated layers 8704 have alternating, different indices of refraction. For example, the planar illumination facility 8702 may be formed by laminating two sheets along diagonal planes 8708. In this way, the grooved structures shown in Figures 85 and 86 are replaced with the laminate structure 8702. For example, the laminated sheets may be made of similar materials (PMMA1 versus PMMA2, where the difference is the molecular weight of the PMMA). As long as each layer is relatively thick, there may be no interference effects, and the structure behaves as a transparent plastic sheet. In the configuration shown, the diagonal laminations redirect a small percentage of the light from the light source 8202 to the reflective display, where the pitch of the laminations is chosen to minimize aberrations.
In an embodiment of the optical system, Figure 88 shows a planar illumination facility 8802 utilizing a 'wedge' configuration. In this embodiment, the light source is directly coupled 8204 to the edge of the planar illumination facility 8802. The light then passes through the planar illumination facility 8802 and encounters the inclined surface of a first wedge 8804, where the light is redirected to the reflective display 8210, then returns to the illumination facility 8802 and passes through the first wedge 8804 and a second wedge 8812 on to the delivery optics. In addition, multilayer coatings 8808, 8810 may be applied to the wedges to improve their transmission properties. In one example, the wedges may be made of PMMA, with dimensions of 1/2 mm high by 10 mm wide, spanning the entire reflective display, with an angle of 1 to 1.5 degrees, and so on. In embodiments, the light may undergo multiple reflections within the wedge 8804 before passing across the wedge 8804 to illuminate the reflective display 8210. If highly reflective coatings 8808 and 8810 were applied to the wedge 8804, light could undergo multiple reflections within the wedge 8804 before reversing direction and passing back to the light source 8202. However, by using multilayer coatings 8808 and 8810 on the wedge 8804, such as with SiO2, niobium pentoxide, and the like, the light may instead be directed to illuminate the reflective display 8210. The coatings 8808 and 8810 are designed to reflect light of specified wavelengths over a broad range of angles, but to transmit light within a certain angular range (e.g., angles beyond θ). In embodiments, this configuration allows light to reflect internally within the wedge until it reaches a transmission window presented to the reflective display 8210, where the coatings are then arranged to allow transmission. The angle of the wedge directs the light from the LED illumination system to evenly illuminate the reflective image display, thereby generating an image that is reflected back through the illumination system. By providing light from the light source 8202 such that a wide cone angle of light enters the wedge 8804, different rays reach transmission windows at different locations along the length of the wedge 8804, thereby providing uniform illumination over the surface of the reflective display 8210, so that the image provided to the wearer's eye has uniform brightness as determined by the image content.
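The multiple-reflection behavior in the wedge can be sketched with a lossless total-internal-reflection model (a simplification that ignores the multilayer coatings 8808, 8810; the 80-degree launch angle is an assumed value, while the 1.5-degree wedge angle and PMMA material follow the example dimensions above): each round trip between the wedge faces reduces the angle of incidence by twice the wedge angle, until the ray drops below the critical angle and can escape toward the display.

```python
import math

N_PMMA = 1.49
critical = math.degrees(math.asin(1.0 / N_PMMA))  # TIR critical angle, ~42.16 deg

wedge_angle = 1.5   # degrees, per the example dimensions above
angle = 80.0        # assumed launch angle from the face normal, degrees

bounces = 0
while angle > critical:        # trapped by total internal reflection
    angle -= 2 * wedge_angle   # each round trip tilts incidence by 2x the wedge angle
    bounces += 1

print(bounces)           # 13 reflections before escape is possible
print(round(angle, 2))   # 41.0 deg, now below the critical angle
```

Rays launched at different angles within the wide input cone drop below the critical angle after different numbers of bounces, which is the mechanism behind the distributed transmission windows described above.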
In embodiments, a see-through optical system including a planar illumination facility 8208 and reflective display 8210 as described herein may be applied to any head-worn device known in the art, including the eyepiece as described herein, but also to helmets (e.g., military helmets, pilot helmets, bicycle helmets, motorcycle helmets, deep-sea helmets, space helmets, and the like), ski goggles, glasses, diving masks, night-vision masks, gas masks, hazmat head covers, virtual reality helmets, simulation devices, and so on. Further, the optical system may be incorporated into the head-worn device and its protective covering in a variety of ways, including insertion of the optical system into the head-worn device in addition to the optical components traditionally associated with the device and covering. For example, the optical system may be included in ski goggles as a separate unit, such as to provide projected content to the user, but where the optical system does not replace any component of the ski goggles, such as the see-through cover of the ski goggles (e.g., the clear or tinted plastic covering exposed to the outside environment, which protects the user's eyes from wind and snow). Alternatively, the optical system may at least in part replace certain optics traditionally associated with the head-worn device. For example, certain optical elements of the delivery optics 8212 may replace the outer lens of an eyewear application. In one example, a beam splitter, lens, or mirror of the delivery optics 8212 may replace the front lens of an eyewear application (e.g., sunglasses), thus eliminating the need for the front lens of the glasses; for instance, if the curved reflecting mirror 8220 were extended to cover the glasses, the need for a lens cover would be eliminated. In embodiments, the see-through optical system including the planar illumination facility 8208 and reflective display 8210 may be located in the head-worn device so as to be unobtrusive to the function and aesthetics of the device. For example, in the case of glasses, or specifically in the case of the eyepiece, the optical system may be located adjacent to the upper portion of the lens, such as in the top of the frame.
In embodiments, the optics assembly may be used in configurations such as head-mounted or helmet-mounted displays, and/or may also include monocular displays, binoculars, holographic binoculars, helmet visors, head-mounted displays with a Mangin mirror, integrated helmet and display sighting systems, helmet-integrated display sighting systems, the Link Advanced Head-Mounted Display (AHMD), and multiple micro-display optics. In embodiments, the optics assembly may include a telephoto lens. Such lenses may be spectacle-mounted or mounted in other ways. Such an embodiment is beneficial for persons with visual impairments. In embodiments, a wide-angle Keplerian telescope of the type described by Eli Peli may be built into a spectacle lens. Such a design may use mirrors embedded within a carrier lens to fold the optical path, and source elements for magnification at higher powers. This allows the wearer to view the magnified and unmagnified fields of view simultaneously in a spectacle format. In embodiments, the optics assembly may be used in a configuration with the Q-Sight helmet-mounted display developed by BAE Systems of London, United Kingdom. Such a configuration provides a heads-up, eyes-out capability for greater situational awareness. Moreover, any of the optics assemblies in the configurations described above may be used in the various embodiments.
The planar illumination facility, also referred to as an illumination module, can provide light of multiple colors, including red-green-blue (RGB) light and/or white light. Light from the illumination module may be directed into a 3LCD system, a Digital Light Processing (DLP®) system, a liquid crystal on silicon (LCoS) system, or another micro-display or micro-projection system. The illumination module may use wavelength combining and nonlinear frequency conversion, with feedback to the source that depends nonlinearly on power, to provide high-brightness, long-lifetime, speckle-reduced or speckle-free light sources. Embodiments of the invention provide light of multiple colors, including red-green-blue (RGB) light and/or white light, where light from the illumination module is directed into a 3LCD system, Digital Light Processing (DLP®) system, liquid crystal on silicon (LCoS) system, or other micro-display or micro-projection system. The illumination modules described herein may be used in the optics assembly of the eyepiece 100.
One embodiment of the invention includes a system comprising: a laser, LED, or other light source configured to generate a light beam of a first wavelength; a planar lightwave circuit coupled to the laser and configured to guide the light beam; and a waveguide optical frequency converter coupled to the planar lightwave circuit and configured to receive the light beam of the first wavelength and convert it into an output light beam of a second wavelength. The system may provide optical feedback to the laser that depends nonlinearly on the power of the light beam of the first wavelength.
Another embodiment of the invention includes a system comprising: a substrate; a light source (such as a diode laser array or one or more LEDs) disposed on the substrate and configured to emit multiple light beams of a first wavelength; a planar lightwave circuit disposed on the substrate and coupled to the light source, configured to combine the multiple light beams and produce a combined beam of the first wavelength; and a nonlinear optical element disposed on the substrate and coupled to the planar lightwave circuit, configured to convert the combined beam of the first wavelength into a light beam of a second wavelength using nonlinear frequency conversion. The system may provide optical feedback to the laser diode array that depends nonlinearly on the power of the combined beam of the first wavelength.
Another embodiment of the invention includes a system comprising: a light source (such as a semiconductor laser array or one or more LEDs) configured to generate multiple light beams of a first wavelength; an arrayed waveguide grating coupled to the light source and configured to combine the multiple beams and output a combined beam of the first wavelength; and a quasi-phase-matched wavelength conversion waveguide coupled to the arrayed waveguide grating and configured to generate an output beam of a second wavelength from the combined beam of the first wavelength using second harmonic generation.
Optical power may be obtained from within the wavelength conversion device and fed back to the source. The power fed back has a nonlinear dependence on the input power supplied by the source to the wavelength conversion device. The nonlinear feedback can reduce the sensitivity of the output power of the wavelength conversion device to variations in the device's nonlinear coefficient, because if the nonlinear coefficient decreases, the power returned increases. The increased feedback tends to increase the power supplied to the wavelength conversion device, thereby mitigating the effect of the reduced nonlinear coefficient.
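The stabilizing effect described above can be demonstrated with a toy fixed-point model (all coefficients are invented for illustration and do not come from the patent): the source delivers more pump power as more unconverted light is fed back, so a drop in the nonlinear coefficient is partially offset by a rise in pump power.

```python
def steady_state(c, p0=1.0, k=0.5, r=0.8, iters=200):
    """Iterate to the self-consistent pump power under nonlinear feedback.
    c: nonlinear conversion coefficient (toy units); r: feedback reflectivity;
    k: how strongly feedback boosts the source; p0: unaided source power."""
    p = p0
    for _ in range(iters):
        conv = min(c * p, 1.0)      # fraction of pump converted
        fb = r * (1.0 - conv) * p   # unconverted pump returned to the source
        p = p0 + k * fb             # feedback raises the delivered pump power
    return p, c * p * p             # (pump power, converted output power)

_, out_nominal = steady_state(c=0.40)
_, out_degraded = steady_state(c=0.30)  # a 25% drop in the nonlinear coefficient

print(round(out_nominal, 3))   # 0.625
print(round(out_degraded, 3))  # 0.522: output falls ~17%, not 25%
```

In this toy model the 25% coefficient drop produces only about a 17% output drop, illustrating the reduced sensitivity the paragraph describes.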
Referring to Figures 109A and 109B, a processor 10902 (e.g., a digital signal processor) may provide display sequential frames 10924 for the display of images by a display component 10928 of the eyepiece 100 (e.g., an LCOS display component). In embodiments, the sequential frames 10924 may be generated with or without the use of a display driver 10912 as an intermediate component between the processor 10902 and the display component 10928. For example, and referring to Figure 109A, the processor 10902 may include a frame buffer 10904 and a display interface 10908 (e.g., a Mobile Industry Processor Interface (MIPI) with a Display Serial Interface (DSI)). The display interface 10908 may provide per-pixel RGB data 10910 to a display driver 10912 acting as the intermediate component between the processor 10902 and the display component 10928, where the display driver 10912 takes the per-pixel RGB data 10910 and generates separate full-frame display data for red 10918, full-frame display data for green 10920, and full-frame display data for blue 10922, thereby providing the display sequential frames 10924 to the display component 10928. In addition, the display driver 10912 may provide timing signals to the display component 10928, such as for synchronizing the delivery of the full frames 10918, 10920, 10922 as the display sequential frames 10924. In another example, and referring to Figure 109B, the display interface 10930 may be configured to eliminate the display driver 10912 by directly providing full-frame display data for red 10934, full-frame display data for green 10938, and full-frame display data for blue 10940 to the display component 10928 as the display sequential frames 10924. In addition, timing signals 10932 may be provided directly from the display interface 10930 to the display component. This configuration may provide significantly lower power consumption by removing the need for a display driver. This direct panel formatting not only removes the need for the driver, but may also simplify the overall logic of the configuration, and removes the redundant memory otherwise required to generate the pixel information and convert from per-pixel, per-frame data to panel format.
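The display driver's job of turning per-pixel RGB data 10910 into the separate full-color fields 10918, 10920, 10922 can be sketched in a few lines (a minimal model; the 2x2 frame and the Python representation are illustrative, not the driver's actual data format):

```python
# Assumed tiny 2x2 frame of packed per-pixel RGB values (0-255)
rgb_frame = [
    [(255, 0, 0), (0, 255, 0)],
    [(0, 0, 255), (255, 255, 255)],
]

def to_sequential_fields(frame):
    """Split packed RGB pixels into the full red, green, and blue fields
    that a field-sequential display (e.g., LCOS) presents one at a time."""
    return [[[px[c] for px in row] for row in frame] for c in range(3)]

red, green, blue = to_sequential_fields(rgb_frame)
print(red)    # [[255, 0], [0, 255]]
print(green)  # [[0, 255], [0, 255]]
print(blue)   # [[0, 0], [255, 255]]
```

A timing signal (such as 10932) would then gate these three fields to the panel in rapid succession so the eye fuses them into a full-color image.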
Figure 89 is a block diagram of an illumination module according to an embodiment of the invention. The illumination module 8900 includes a light source, a combiner, and an optical frequency converter according to one embodiment of the invention. Light sources 8902, 8904 emit optical radiation 8910, 8914 toward input ports 8922, 8924 of a combiner 8906. The combiner 8906 has a combiner output port 8926 that emits combined radiation 8918. The combined radiation 8918 is received by an optical frequency converter 8908, which provides output optical radiation 8928. The optical frequency converter 8908 may also provide feedback radiation 8920 to the combiner output port 8926. The combiner 8906 separates the feedback radiation 8920 to provide source feedback radiation 8912 emitted from input port 8922 and source feedback radiation 8916 emitted from input port 8924. The source feedback radiation 8912 is received by light source 8902, and the source feedback radiation 8916 is received by light source 8904. The optical radiation 8910 and source feedback radiation 8912 between the light source 8902 and the combiner 8906 may propagate in any combination of free space and/or guiding structures (e.g., optical fibers or any other optical waveguides). The optical radiation 8914, source feedback radiation 8916, combined radiation 8918, and feedback radiation 8920 may also propagate in any combination of free space and/or guiding structures.
Suitable light sources 8902 and 8904 include one or more LEDs, or any optical emitter having an emission wavelength that is influenced by optical feedback. Examples of such sources include lasers, and may be semiconductor diode lasers. For example, light sources 8902 and 8904 may be elements of a semiconductor laser array. Sources other than lasers can also be used (e.g., an optical frequency converter can be used as a source). Although two sources are shown in Figure 89, the invention can also be practiced with more than two sources. The combiner 8906 is shown generally as a three-port device having ports 8922, 8924, and 8926. Although ports 8922 and 8924 are referred to as input ports, and port 8926 is referred to as the combiner output port, these ports can be bidirectional, and can receive and emit optical radiation as described above.
The combiner 8906 may include waveguide dispersive elements and optical elements that define the ports. Suitable waveguide dispersive elements include arrayed waveguide gratings, reflective diffraction gratings, transmissive diffraction gratings, holographic optical elements, assemblies of wavelength-selective filters, and photonic band-gap structures. Accordingly, the combiner 8906 may be a wavelength combiner, where the input ports each have corresponding, non-overlapping input port wavelength ranges for efficient coupling to the combiner output port.
A variety of optical processes can occur in the optical frequency converter 8908, including but not limited to: harmonic generation, sum frequency generation (SFG), second harmonic generation (SHG), difference frequency generation, parametric generation, parametric amplification, parametric oscillation, three-wave mixing, four-wave mixing, stimulated Raman scattering, stimulated Brillouin scattering, stimulated emission, acousto-optic frequency shifting, and/or electro-optic frequency shifting.
Generally, the optical frequency converter 8908 receives optical input according to an input set of optical wavelengths, and provides optical output according to an output set of optical wavelengths, where the output set differs from the input set.
The optical frequency converter 8908 may include nonlinear optical materials such as lithium niobate, lithium tantalate, potassium titanyl phosphate, potassium niobate, quartz, silicon, silicon oxynitride, gallium arsenide, lithium borate, and/or barium metaborate. Optical interactions in the optical frequency converter 8908 can occur in various structures, including bulk structures, waveguides, quantum well structures, quantum wire structures, quantum dot structures, photonic band-gap structures, and/or multi-component waveguide structures.
In cases where the optical frequency converter 8908 provides a parametric nonlinear optical process, that nonlinear optical process is preferably phase-matched. Such phase matching may be birefringent phase matching or quasi-phase matching. Quasi-phase matching may include the methods disclosed in United States Patent 7,116,468 to Miller, the disclosure of which is incorporated herein by reference.
The optical frequency converter 8908 may also include various elements to improve its operation, such as wavelength-selective reflectors for wavelength-selective output coupling, wavelength-selective reflectors for wavelength-selective resonance, and/or wavelength-selective loss elements for controlling the spectral response of the converter.
In embodiments, multiple illumination modules as described in Figure 89 may be combined to form a compound illumination module.
One component of the illumination module may be a diffraction grating, or grating, as described further herein. The diffraction grating may be less than 1 mm thick, yet still robust enough to be permanently glued in place or to replace the cover glass of the LCoS. One advantage of using a grating in the illumination module is that, with laser illumination sources, it increases efficiency and reduces power. The grating may inherently produce less stray light, and because the waveband is relatively narrow, it allows more options for filtering out eye glow with a smaller reduction in see-through brightness.
Figure 90 depicts a block diagram of an optical frequency converter according to an embodiment of the invention. Figure 90 shows how the feedback radiation 8920 is provided by an exemplary optical frequency converter 8908 that performs parametric frequency conversion. The combined radiation 8918 provides forward radiation 9002 propagating to the right in Figure 90 within the optical frequency converter 8908, and parametric radiation 9004, also propagating to the right in Figure 90, is generated in the optical frequency converter 8908 and emitted from it as output optical radiation 8928. Typically, there is a net power transfer from the forward radiation 9002 to the parametric radiation 9004 as the interaction proceeds (in this example, as the radiation propagates to the right). A reflector 9008 having a wavelength-dependent transmittance may be placed within the optical frequency converter 8908 to reflect (or partially reflect) the forward radiation 9002 in order to provide backward radiation 9006, or it may be placed outside the optical frequency converter 8908 after the end face 9010. The reflector 9008 may be a grating, an internal interface, a coated or uncoated end face, or any combination thereof. A preferred level of reflectivity for the reflector 9008 is greater than 90%. A reflector at the input interface 9012 provides purely linear feedback (i.e., feedback independent of the process efficiency). A reflector at the end face 9010 provides maximally nonlinear feedback, because the dependence of the forward power on the process efficiency is greatest at the output interface (for a phase-matched parametric interaction).
Figure 91 is a block diagram of a laser illumination module according to an embodiment of the invention. Although a laser is used in this embodiment, it should be understood that other light sources, such as LEDs, may also be used. The laser illumination module 9100 includes a diode laser array 9102, waveguides 9104 and 9106, star couplers 9108 and 9110, and an optical frequency converter 9114. The diode laser array 9102 has laser emitting elements coupled to waveguides 9104, which act as the input ports (such as ports 8922 and 8924 of Figure 89) of a slab waveguide star coupler 9108. The star coupler 9108 is coupled to another slab waveguide star coupler 9110 through waveguides 9106 having different lengths. The combination of star couplers 9108 and 9110 and waveguides 9106 can be an arrayed waveguide grating, and acts as a wavelength combiner providing combined radiation 8918 to waveguide 9112 (e.g., the combiner 8906 of Figure 89). Waveguide 9112 provides the combined radiation 8918 to the optical frequency converter 9114. In the optical frequency converter 9114, an optional reflector 9116 may provide back-reflection of the combined radiation 8918. As described above in connection with Figure 90, this back-reflection provides nonlinear feedback according to embodiments of the invention. One or more of the elements described with reference to Figure 91 may be fabricated in a common substrate using planar coating processes and/or photolithographic methods, to reduce cost, parts count, and alignment requirements.
A second waveguide may be placed so that its core is near the waveguide core in the optical frequency converter 8908. As is known in the art, such an arrangement of waveguides acts as a directional coupler, so that radiation in the second waveguide can provide additional radiation within the optical frequency converter 8908. Significant coupling can be avoided by providing radiation at a wavelength different from that of the forward radiation 9002, or spurious radiation can be coupled into the optical frequency converter 8908 at a location where the forward radiation 9002 is depleted.
Although a standing-wave feedback configuration, in which the feedback power counter-propagates along the same path traveled by the input power, is useful, a traveling-wave feedback configuration can also be used. In a traveling-wave feedback configuration, the feedback re-enters the gain medium at a location different from the exit location of the input power.
Figure 92 is a block diagram of a compound laser illumination module according to another embodiment of the invention. The compound laser illumination module 9200 includes one or more laser illumination modules 9100 as described with reference to Figure 91. Although Figure 92 shows, for simplicity, a compound laser illumination module 9200 including three laser illumination modules 9100, the compound laser illumination module 9200 may include more or fewer laser illumination modules 9100. The diode laser array 9210 may include one or more diode laser arrays 9102, which may be arrays of laser diodes, diode laser bars, and/or semiconductor laser arrays configured to emit optical radiation in the infrared spectrum (i.e., wavelengths shorter than radio waves but longer than visible light).
Laser array output waveguides 9220 are coupled to the diode lasers in the diode laser array 9210, and direct the output of the diode laser array 9210 to star couplers 9108A-C. The laser array output waveguides 9220, arrayed waveguide gratings 9230, and optical frequency converters 9114A-C may be fabricated on a single substrate using planar lightwave circuits, and may include silicon oxynitride waveguides and/or lithium tantalate waveguides.
The arrayed waveguide gratings 9230 include star couplers 9108A-C, waveguides 9106A-C, and star couplers 9110A-C. Waveguides 9112A-C respectively provide the combined radiation to optical frequency converters 9114A-C, and provide feedback radiation to star couplers 9110A-C.
The optical frequency converters 9114A-C may include nonlinear optical (NLO) elements, such as optical parametric oscillator elements and/or quasi-phase-matched optical elements.
The compound laser illumination module 9200 can produce output optical radiation at multiple wavelengths. These multiple wavelengths may be in the visible spectrum, i.e., wavelengths shorter than infrared light but longer than ultraviolet light. For example, waveguide 9240A may provide output optical radiation between about 450 nm and about 470 nm, waveguide 9240B may provide output optical radiation between about 525 nm and about 545 nm, and waveguide 9240C may provide output optical radiation between about 615 nm and about 660 nm. These ranges of output optical radiation may in turn be selected to provide visible wavelengths pleasing to a human viewer (e.g., blue, green, and red wavelengths respectively), and may in turn be combined to produce a white light output.
The waveguides 9240A-C may be fabricated on the same planar lightwave circuit as the laser array output waveguides 9220, arrayed waveguide gratings 9230, and optical frequency converters 9114A-C. In some embodiments, the output optical radiation provided by each of the waveguides 9240A-C may provide optical power in a range between about 1 watt and about 20 watts.
The optical frequency converter 9114 may include a quasi-phase-matched wavelength conversion waveguide configured to perform second harmonic generation (SHG) on the combined radiation of the first wavelength to produce radiation of a second wavelength. The quasi-phase-matched wavelength conversion waveguide may be configured to use the radiation of the second wavelength to pump an optical parametric oscillator integrated into the quasi-phase-matched wavelength conversion waveguide and generate radiation of a third wavelength, which is optionally different from the second wavelength. The quasi-phase-matched wavelength conversion waveguide may also produce feedback radiation that propagates via waveguide 9112 through the arrayed waveguide grating 9230 to the diode laser array 9210, so that each laser disposed in the diode laser array 9210 can be made to operate at a unique wavelength determined by the corresponding port on the arrayed waveguide grating.
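Because second harmonic generation doubles the optical frequency, the output wavelength is simply half the pump wavelength; the sketch below picks hypothetical IR pump wavelengths (assumed values, not specified in the text) that land inside the visible bands quoted above for waveguides 9240A-C.

```python
def shg_wavelength_nm(pump_nm):
    """Second harmonic generation: frequency doubles, so wavelength halves."""
    return pump_nm / 2.0

# Assumed IR pumps chosen to reach the blue, green, and red output bands
pumps_nm = {"blue": 920.0, "green": 1064.0, "red": 1250.0}
for color, pump in pumps_nm.items():
    print(color, shg_wavelength_nm(pump))
# blue 460.0 (450-470 nm band), green 532.0 (525-545), red 625.0 (615-660)
```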
For example, composite laser illumination module 9200 can be configured to use a diode laser array 9210 nominally operating at a wavelength of about 830 nm to produce output light radiation in the visible spectrum corresponding to any of red, green, or blue.
Composite laser illumination module 9200 can optionally be configured to directly illuminate a spatial light modulator without intervening optics. In some embodiments, composite laser illumination module 9200 can use a diode laser array 9210 nominally operating at a single first wavelength to simultaneously produce output light radiation at multiple second wavelengths (such as wavelengths corresponding to red, green, or blue). Each different second wavelength can be produced by an instance of laser illumination module 9100.
Composite laser illumination module 9200 can be configured to combine the output light radiation of the multiple second wavelengths into a single waveguide by using waveguide selective taps (not shown) to produce diffraction-limited white light.
Diode laser array 9210, laser array output waveguides 9220, arrayed waveguide grating 9230, waveguides 9112, optical frequency converters 9114, and frequency converter output waveguides 9240 can be fabricated on a common substrate using fabrication processes such as coating or photolithography. As described with reference to Figure 92, beam shaping element 9250 is coupled to composite laser illumination module 9200 by waveguides 9240A-C.
Beam shaping element 9250 can be placed on the same substrate as composite laser illumination module 9200. The substrate may include, for example, a thermally conductive material, a semiconductor material, or a ceramic material. The substrate may include copper-tungsten, silicon, gallium arsenide, lithium tantalate, silicon oxynitride, and/or gallium nitride, and can be processed using semiconductor fabrication processes including coating, photolithography, etching, deposition, and implantation.
Certain of these elements (such as diode laser array 9210, laser array output waveguides 9220, arrayed waveguide grating 9230, waveguides 9112, optical frequency converters 9114, waveguides 9240, beam shaping element 9250, and the various associated planar lightwave circuits) can be passively coupled and/or aligned, and in certain embodiments are passively aligned on a common substrate by high-precision packaging. Each of waveguides 9240A-C can be coupled to a different instance of beam shaping element 9250, rather than being coupled to a single element as shown.
Beam shaping element 9250 can be configured to shape the output light radiation from waveguides 9240A-C into a substantially rectangular diffraction-limited beam, and can also be configured to give the output light radiation from waveguides 9240A-C a brightness uniformity of greater than about 95% over the substantially rectangular beam shape.
Beam shaping element 9250 may include an aspheric lens, such as a "top-hat" microlens, a holographic element, or a grating. In some embodiments, the diffraction-limited beam output by beam shaping element 9250 produces substantially reduced speckle, or is speckle-free. The beam output by beam shaping element 9250 can provide optical power in the range between about 1 watt and about 20 watts, and can have a substantially flat phase front.
Figure 93 is a block diagram of an imaging system according to an embodiment of the present invention. Imaging system 9300 includes light engine 9310, beam 9320, spatial light modulator 9330, modulated beam 9340, and projection lens 9350. Light engine 9310 can be a composite light illumination module, such as the multiple illumination modules described in Figure 89, the composite laser illumination module 9200 described with reference to Figure 92, or the laser illumination system 9200 described with reference to Figure 92. Spatial light modulator 9330 can be a 3LCD system, a DLP system, an LCoS system, a transmissive liquid crystal display (for example, transmissive LCoS), a liquid-crystal-on-silicon array, a grating-based light valve, another microdisplay or microprojection system, or a reflective display.
Spatial light modulator 9330 can be configured to spatially modulate beam 9320. Spatial light modulator 9330 can be coupled to electronic circuitry configured to cause spatial light modulator 9330 to modulate a video image (such as a video image displayable by a television or computer monitor) onto beam 9320 to produce modulated beam 9340. In some embodiments, operating by reflection, modulated beam 9340 can exit the spatial light modulator from the same side on which beam 9320 is received. In other embodiments, operating by transmission, modulated beam 9340 can exit the spatial light modulator from the side opposite that on which beam 9320 is received. Modulated beam 9340 may optionally be coupled into projection lens 9350. Projection lens 9350 is typically configured to project modulated beam 9340 onto a display such as a video display screen.
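Conceptually, the spatial modulation described above is a per-pixel scaling of the illuminating beam by the video frame. The toy model below is a deliberate idealization (no polarization, diffraction, or gamma), intended only to make the "image modulated onto the beam" step concrete; the function name is our own.

```python
def modulate(beam, frame):
    """Idealized spatial light modulation: each SLM pixel scales the
    local illuminating-beam intensity by its gray level (0.0-1.0),
    producing the modulated beam that the projection lens images."""
    return [[b * g for b, g in zip(beam_row, frame_row)]
            for beam_row, frame_row in zip(beam, frame)]

# Uniform unit-intensity illumination modulated by a 2x2 frame.
modulated = modulate([[1.0, 1.0], [1.0, 1.0]],
                     [[0.0, 0.5], [1.0, 0.25]])
print(modulated)  # prints: [[0.0, 0.5], [1.0, 0.25]]
```

Under this idealization, the benefit of the >95% beam uniformity claimed earlier is visible directly: any non-uniformity in `beam` multiplies straight into the displayed image.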
A method of illuminating a video display can be performed using a composite illumination module (such as a composite illumination module including multiple illumination modules 8900), composite laser illumination module 9100, laser illumination system 9200, or imaging system 9300. A diffraction-limited output beam is generated using the composite illumination module, composite laser illumination module 9100, laser illumination system 9200, or light engine 9310. The output beam is directed using a spatial light modulator (such as spatial light modulator 9330) and optionally projection lens 9350. The spatial light modulator can project an image onto a display such as a video display screen.
An illumination module can be configured to emit any number of wavelengths, including one, two, three, four, five, six, or more, with the wavelengths spaced apart by differing amounts and having equal or unequal power levels. An illumination module can be configured to emit a single wavelength per beam or multiple wavelengths per beam. An illumination module may also include additional components and functions, including polarization controllers, polarization rotators, power supplies, power circuitry such as power FETs, electronic control circuitry, thermal management systems, heat pipes, and safety interlocks. In some embodiments, an illumination module can be coupled to an optical fiber or light guide, such as glass (for example, BK7).
Some options for LCoS front-light designs include: 1) a wedge with a multi-layer coating (MLC). This concept uses the MLC to define the specific angles that are reflected and transmitted; 2) a wedge with a polarizing beam splitter coating. This concept works like a conventional PBS cube but at a much shallower angle. This can be a PBS coating or a wire-grid film; 3) PBS prism bars (these are similar to option #2, but have a seam down the center of the faceplate); 4) a wire-grid polarizer sheet beam splitter (similar to the PBS wedge, but just a sheet, so that it is mostly air rather than solid glass); and 5) a flexible-film polarizing beam splitter (PBS), such as a 3M polarizing beam splitter made of alternating layers of different plastics with customized refractive indices (so that the indices are matched in one in-plane direction but not in the other). In the mismatched direction, a quarter-wave stack acting as a high reflector is formed, and in the matched direction the film acts as a transparent plastic sheet. The film can be laminated between glass prisms to form a high-performance wide-angle PBS for fast beams over the entire visual range. An MLC wedge can be rugged, and can be glued securely in place with no air gaps and without cold or thermal deflection. It can be used with a broadband LED light source. In embodiments, for a complete module, the MLC wedge can replace the cover glass of the LCoS. The MLC wedge can be approximately less than 4 mm thick. In one embodiment, the MLC wedge can be 2 mm thick or thinner.
It will be appreciated that the present invention provides for the deployment of the front-light systems described herein in all types of optical configurations, which may include, but need not include, augmented reality eyepieces. Front-light systems used as components in any type of optical system, particularly as sources of direct or indirect illumination for any one or more optical elements, optical surfaces, or optical sensors, are especially preferred for components having optical paths of alternative configurations, such as LCoS or liquid crystal displays and/or reflected light. In some embodiments, at least some of the light generated by the front-light system can be reflected so as to be transmitted back through a portion of the front-light system on its way to its final destination (for example, an eye, an optical sensor, etc.), while in other embodiments the generated light does not pass back through the front-light system on its way to its final destination. For example, the front-light system can illuminate an optical device such as an LCoS to obtain image light, the image light can be directed back through components of the front-light system, and the image light can thereafter be conditioned by additional optics before finally being received by one or more eyes of a user. Such other optical systems can be or may include: waveguides (which can be freeform waveguides), beam splitters, collimators, polarizers, mirrors, lenses, and diffraction gratings.
Figure 95 depicts an embodiment of an LCoS front-light design. In this embodiment, light from RGB LED 9508 illuminates front light 9504, which can be a wedge, a PBS, etc. The light strikes polarizer 9510 and is emitted in its S state toward LCoS 9502, where it is reflected back as image light in its P state through asphere 9512. Inline polarizer 9514 can repolarize the image light and/or impart a half-wave rotation to the S state. The image light then strikes wire-grid polarizer 9520 and is reflected toward curved (spherical) partial mirror 9524, passing through half-wave retarder 9522 on the way. The image light reflects from the mirror to the user's eye 9518, passing again through half-wave retarder 9522 and wire-grid polarizer 9520. Various examples of front light 9504 will now be discussed.
In embodiments, the optics assembly includes partially reflective, partially transmissive optical elements that reflect respective portions of the image light from the image source and transmit scene light from a see-through view of the surrounding environment, so as to provide to the user's eye a combined image composed of the reflected portions of the image light and the transmitted scene light.
In portable display systems, it is important to provide a bright, compact, and lightweight display. Portable display systems include mobile phones, laptop computers, tablet computers, and head-mounted displays.
The present invention provides a compact and lightweight front light for portable display systems, the front light being composed of a curved or otherwise non-planar wire-grid polarizer film acting as a partial reflector, the polarizer film efficiently deflecting light from an edge light to illuminate a reflective image source. Wire-grid polarizers are known to provide highly efficient reflection of one polarization state while allowing the other polarization state to pass through. Although glass-sheet wire-grid polarizers are well known in the industry, and rigid wire-grid polarizers can be used in the present invention, in a preferred embodiment of the invention a flexible wire-grid polarizer film is used as the curved wire-grid polarizer. Suitable wire-grid polarizer films are available from the Asahi-Kasei E-materials company of Tokyo, Japan.
An edge light provides a compact form of illumination for a display, but since it is located at the edge of the image source, the light must be deflected 90 degrees to illuminate the image source. In one embodiment of the invention, a curved wire-grid polarizer film is used as a partially reflective surface so that the light provided by the edge light is deflected downward to illuminate the reflective image source. A polarizer is provided adjacent to the edge light so that the illuminating light provided to the curved wire-grid polarizer is polarized. The polarizer and the wire-grid polarizer are oriented such that light passing through the polarizer is reflected by the wire-grid polarizer. Because of a quarter-wave retarder film included in the reflective image source, the polarization of the reflected image light is the opposite polarization state compared with the illuminating light. The reflected image light therefore passes through the wire-grid polarizer film and continues on to the display optics. By using a flexible wire-grid polarizer film as the partial reflector, the partially reflective surface can be curved in a lightweight structure, the wire-grid polarizer in this configuration taking on the dual role of a reflector for the illuminating light and a transparent component for the image light. The wire-grid polarizer film offers the advantage that it can receive image light over a wide range of incidence angles, so that the curve does not interfere with the image light passing through to the display optics. Further, since the wire-grid polarizer film is thin (for example, less than 200 microns), the curved shape does not significantly distort the image light as it passes through to the display optics. Finally, the tendency of a wire-grid polarizer to scatter light is very low, so high image contrast is maintained.
Figure 136 shows a schematic of the image source 13600 of a front light of the invention. Edge light 13602 provides illuminating light through polarizer 13614 so that illuminating light 13610 is polarized, where polarizer 13614 can be an absorptive polarizer or a reflective polarizer. The polarizer is oriented such that the polarization state of illuminating light 13610 causes the light to be reflected by curved wire-grid polarizer 13608, deflecting illuminating light 13610 downward toward reflective image source 13604. The pass axis of polarizer 13614 is therefore perpendicular to the pass axis of wire-grid polarizer 13608. Those skilled in the art will appreciate that although Figure 136 shows the image source 13600 of the front light oriented horizontally, other orientations are equally possible. As already described, a reflective image source such as an LCOS image source typically includes a quarter-wave retarder film, so that the polarization state of the illuminating light is changed during reflection by the reflective image source, and as a result the image light has a polarization state generally opposite that of the illuminating light.
As known to those skilled in the art and as described in United States Patent 4398805, this change in polarization state is fundamental to the operation of all liquid-crystal-based displays. For the various portions of the image, the liquid crystal cells in reflective image source 13604 induce a greater or lesser change in polarization state, so that the reflected image light 13612 has a mixed elliptical polarization state before passing through the curved wire-grid polarizer. After passing through curved wire-grid polarizer 13608 and any additional polarizers that may be included in the display optics, the polarization state of image light 13612 is determined by curved wire-grid polarizer 13608, and the image content contained within image light 13612 determines the local intensity of image light 13612 in the image displayed by the portable display system.
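The polarization flip underlying this front light — illumination reflected by the wire-grid polarizer, converted on a double pass through the image source's quarter-wave retarder, and then transmitted by the same wire grid — can be checked with a small Jones-calculus sketch. This is an illustrative calculation under stated simplifications (ideal components, a fully bright pixel, reflection sign conventions ignored), not a description of the patented hardware.

```python
def matmul(a, b):
    """Product of two 2x2 complex Jones matrices."""
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def apply(m, v):
    """Apply a Jones matrix to a Jones polarization vector."""
    return [m[0][0]*v[0] + m[0][1]*v[1], m[1][0]*v[0] + m[1][1]*v[1]]

# Quarter-wave retarder with its fast axis at 45 degrees.
QWP_45 = [[(1 + 1j) / 2, (1 - 1j) / 2],
          [(1 - 1j) / 2, (1 + 1j) / 2]]

# Illumination deflected down by the wire grid: pure s polarization.
s_in = [1, 0]

# Idealized bright-pixel round trip through the image source: the double
# pass through the quarter-wave film acts as a half-wave plate at 45 deg.
round_trip = matmul(QWP_45, QWP_45)
image_light = apply(round_trip, s_in)
# image_light is [0, 1] up to a global phase: pure p polarization,
# which the wire-grid polarizer transmits on to the display optics.
```

A gray pixel, whose liquid crystal cell imparts less than a full quarter-wave of retardance per pass, would leave `image_light` elliptical — only its p component survives the wire grid, which is exactly how the image content sets the local intensity.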
The flexible nature of the wire-grid polarizer film used in curved wire-grid polarizer 13608 allows it to be shaped so that illuminating light 13610 is focused onto reflective image source 13604. The shape of the curve of the curved wire-grid polarizer is selected to provide uniform illumination of the reflective image source. Figure 136 shows curved wire-grid polarizer 13608 with a parabolic shape, but radial curves, complex spline curves, or planes are also possible, depending on the nature of edge light 13602, to deflect illuminating light 13610 uniformly onto reflective image source 13604. Experiments show that parabolic, radial, and complex spline curves all provide more uniform illumination than a flat surface. However, in some very thin front-light image sources, a flat wire-grid polarizer film can be used efficiently to provide a lightweight portable display system. The shape of the flexible wire-grid polarizer film can be maintained with side frames, as shown in Figure 138, which have shaped slots of suitable profile to hold the wire-grid polarizer in place; Figure 138 shows a schematic of front-light image source assembly 13800. Side frame 13802 is shown with curved slot 13804 to hold the flexible wire-grid polarizer film in the required curved shape. Although only one side frame 13802 is shown in Figure 138, two side frames 13802 and other components of the front-light image source can be used to support the curved shape on either side. In any case, because the majority of the front-light image source of the invention is composed of air and the wire-grid polarizer film is very thin, the weight is low compared with prior-art front-light systems.
In another embodiment of the invention, front-light image source 13700 is provided, having two or more edge lights 13702 placed along two or more edges of reflective image source 13604. A polarizer 13712 is provided adjacent to each edge light 13702 so that illuminating light 13708 is polarized. Illuminating light 13708 is deflected by curved wire-grid polarizer 13704 to illuminate reflective image source 13604. The reflected image light 13710 then passes through curved wire-grid polarizer 13704 and on to the display optics. The advantage of using two or more edge lights 13702 is that more light can be applied to reflective image source 13604, thereby providing a brighter image.
The edge lights can be fluorescent lamps, incandescent lamps, organic light-emitting diodes, lasers, or electroluminescent lamps. In a preferred embodiment of the invention, the edge light is an array of three or more light-emitting diodes. To uniformly illuminate the reflective image source, the edge light should have a fairly large cone angle; for example, the edge light can be a Lambertian source. In the case of a laser light source, the cone angle of the light needs to be expanded. By using an array or multiple edge lights as the light source, the distribution of light onto the reflective image source can be adjusted to provide more uniform illumination, with the result that the brightness of the displayed image can be made more uniform.
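Why a Lambertian source suits this geometry can be seen from its angular power distribution: the radiant intensity falls off as the cosine of the off-axis angle, and integrating over solid angle shows that the fraction of power inside a cone of half-angle θ is sin²θ. The sketch below is a standard textbook relation, not specific to the patent.

```python
import math

def lambertian_intensity(theta_deg, i0=1.0):
    """Radiant intensity of a Lambertian emitter: I(theta) = I0*cos(theta)."""
    return i0 * math.cos(math.radians(theta_deg))

def power_fraction_within(theta_deg):
    """Fraction of total Lambertian power emitted inside a cone of
    half-angle theta; integrating I0*cos(t) over solid angle gives
    sin^2(theta)."""
    return math.sin(math.radians(theta_deg)) ** 2

# A bare LED (roughly Lambertian) puts 75% of its power inside a
# 60-degree half-angle cone — a broad spread that helps fill the area
# of the image source from the edge.
print(round(power_fraction_within(60.0), 2))  # prints: 0.75
```

A collimated laser, by contrast, concentrates nearly all of its power in a cone of a fraction of a degree, which is why the text notes that a laser's cone angle must be expanded before it can serve as an edge light.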
The image light provided by the front-light image source of the invention is passed into the display optics of the portable display system. Depending on how the image of the display is to be used, various display optics are possible. For example, when the display is a flat-screen display, the display optics can be diffusive, or alternatively, when the display is a near-eye display or a head-mounted display, the display optics can be refractive or diffractive.
Figure 139 is a flow chart of a method of the invention for a portable display system with a reflective image source. In step 13900, polarized illuminating light is provided to one or more edges of the reflective image source. In step 13902, a curved wire-grid polarizer receives the illuminating light and deflects it to illuminate the reflective image source, where the curve of the wire-grid polarizer is chosen to improve the uniformity of illumination over the area of the reflective image source. In step 13904, the reflective image source receives the illuminating light, reflects it, and at the same time changes the polarization state of the illuminating light in correspondence with the image being displayed. In step 13908, the image light then passes through the curved wire-grid polarizer and into the display optics. In step 13910, the image is displayed by the portable display system.
In embodiments, a lightweight portable display system for displaying images with a reflective LCD image source can include one or more edge lights (adjoining one or more edges of the reflective LCD image source to provide polarized illuminating light), a curved wire-grid polarizer partial reflector (which can receive the polarized illuminating light and deflect it to illuminate the reflective LCD image source), and display optics (which receive the reflected image light from the reflective LCD image source and display the image). In addition, the one or more edge lights may include light-emitting diodes. In embodiments, the wire-grid polarizer can be a flexible film, and the flexible film can be held in a curved shape by side frames. In embodiments, the curved wire-grid polarizer of the display system can be a parabolic, radial, or complex spline curve. In addition, the reflective LCD image source of the display system can be an LCOS. In embodiments, the display optics of the display system may include a diffuser, and the display system can be a flat-screen display. In embodiments, the display optics of the display system may include refractive or diffractive elements, and the display system can be a near-eye display or a head-mounted display.
In embodiments, a method for providing an image on a lightweight portable display system with a reflective LCD image source can include: providing polarized illuminating light to one or more edges of the reflective LCD image source; receiving the illuminating light with a curved wire-grid polarizer and deflecting the light to illuminate the reflective LCD image source; reflecting the illuminating light and changing its polarization state in correspondence with the image to be displayed by the reflective LCD image source to provide image light; passing the image light through the curved wire-grid polarizer; receiving the image light with the display optics; and displaying the image. In embodiments of the method, the curved shape of the curved wire-grid polarizer can be chosen to improve the uniformity of illumination of the reflective LCD image source. In addition, the one or more edge lights may include light-emitting diodes. In embodiments, the wire-grid polarizer can be a flexible film. In addition, the flexible film can be held in a curved shape by side frames. In various embodiments of the invention, the curved wire-grid polarizer can be a parabolic, radial, or complex spline curve. In addition, in embodiments of the above method, the reflective LCD image source can be an LCOS. In embodiments, the display optics may include a diffuser, and the display system can be a flat-screen display. In embodiments of the above method, the display optics may include refractive or diffractive elements, and the display system can be a near-eye display or a head-mounted display.
Figure 96 depicts an embodiment of front light 9504 including optically bonded prisms with a polarizer. The prisms appear as two cuboids with a substantially transparent interface 9602 between them. Each prism is bisected diagonally, with polarizing film 9604 placed along the bisecting interface. The lower bisected portions of the cuboids are formed by triangles and can optionally be made as a single piece 9608. The prisms can be made of BK-7 or equivalent. In this embodiment, the cuboids have square ends measuring 2 mm by 2 mm. The length of the cuboids in this embodiment is 10 mm. In an alternate embodiment, the bisection includes a 50% mirror 9704 surface, and the interface between the two cuboids includes a polarizer 9702 that transmits light of the P state.
Figure 98 depicts three versions of LCoS front-light designs. Figure 98A depicts a wedge with a multi-layer coating (MLC). This concept uses the MLC to define the specific angles that are reflected and transmitted. In this embodiment, image light of either the P or S polarization state is viewed by the user's eye. Figure 98B depicts a PBS with a polarizer coating. Here, only S-polarized image light is transmitted to the user's eye. Figure 98C depicts a right-angle prism, eliminating much of the prism material so that the image light can be transmitted through air as S-polarized light.
Figure 99 depicts a wedge plus PBS with polarizing film 9902 stacked on LCoS 9904.
Figure 100 depicts two embodiments of prisms in which light enters the short end (A) and light enters along the long end (B). In Figure 100A, the wedge is formed by offsetting the bisection of the cuboid, so that the bisecting interface forms an angle of at least 8.6 degrees. In this embodiment, the offset bisection yields one portion 0.5 mm high and another portion 1.5 mm high on the side through which the RGB LED 10002 emits light. Polarizing film 10004 is placed along the bisection. In Figure 100B, the wedge is formed by offsetting the bisection of the cuboid, so that the bisecting interface forms an angle of at least 14.3 degrees. In this embodiment, the offset bisection yields one portion 0.5 mm high and another portion 1.5 mm high on the side through which the RGB LED 10008 emits light. Polarizing film 10010 is placed along the bisection.
Figure 101 depicts a curved PBS film 10104 illuminated by an RGB LED 10102 placed over LCoS chip 10108. RGB light from LED array 10102 is reflected by PBS film 10104 onto the surface of LCoS chip 10108, while light reflected from the imaging chip is allowed to pass unimpeded to the optics assembly and ultimately to the user's eye. Films usable in this system include the Asahi film, which has a triacetate cellulose or cellulose acetate (TAC) substrate. In embodiments, the film can have a 100 nm UV-embossed corrugation, with a calendered coating constructed on the ridges that can be angled for the incidence angle of the light. The Asahi film can be supplied in rolls 20 cm wide by 30 meters long, and has BEF properties when used in LCD illumination. The Asahi film can support wavelengths from visible light to IR, and remains stable up to 100 °C.
In another embodiment, Figures 21 and 22 depict an alternative arrangement of the waveguide and projector in exploded view. In this arrangement, the projector is placed just behind the hinge in the arm of the eyepiece, and it is oriented vertically so that the RGB LED signals initially travel vertically until their direction is changed by a reflecting prism to enter the waveguide lens. The vertically arranged projection engine can have the PBS 218 at the center, the RGB LED array at the bottom, a hollow tapered tunnel with thin-film diffusers to mix the colors for collection, and a condenser lens in the optics. The PBS can have a pre-polarizer on its entrance face. The pre-polarizer can be aligned to transmit light of one polarization (such as p-polarized light) and reflect (or absorb) light of the opposite polarization (such as s-polarized light). The polarized light can then pass through the PBS to field lens 216. The purpose of field lens 216 can be to establish near-telecentric illumination of the LCoS panel. The LCoS display can be truly reflective, reflecting colors in the correct temporal order so that the image is displayed correctly. Light can reflect from the LCoS panel, and for the bright areas of the image the light can be rotated to s polarization. The light then refracts through field lens 216, can be reflected at the internal interface of the PBS, and leaves the projector toward the coupling lens. The hollow tapered tunnel 220 can replace the homogenizing lenslets of other embodiments. By orienting the projector vertically and placing the PBS at the center, space is conserved, and the projector can be placed in the hinge space, cantilevered from the waveguide with almost no bending moment.
Light from the image source or the associated optics of the eyepiece can be reflected or scattered outward into the environment. These light losses are perceived by outside viewers as "eyeglow" or "nightglow," wherein, when viewed in a relatively dark environment, portions of the lens area or the area around the eyepiece appear to glow. In certain cases of eyeglow, as shown in Figure 22A, the displayed image can be seen by an outside viewer as an observable image 2202A in the display area when viewed from outside. It is preferable to reduce eyeglow, both to maintain the privacy of the image being viewed and to make the eyepiece less noticeable when used by the user in a relatively dark environment, thereby preserving the privacy of the user's viewing experience. Various methods and devices can reduce eyeglow through light control elements, such as using partially reflective mirrors in the optics associated with the image light source, using polarizing optics, and the like. For example, the light entering the waveguide can be polarized, such as s-polarized. The light control element can include a linear polarizer. The linear polarizer in the light control element is oriented relative to the linearly polarized image light such that the second portion of the linearly polarized image light that passes through the partially reflective mirror is blocked, and eyeglow is reduced. In embodiments, eyeglow can be minimized or eliminated by polarizing optics attached to the lens, waveguide, or frame (such as the fastened optics described herein), which polarize the light reflected from the user's eye to the opposite state (for example, p-polarization in this instance).
In embodiments, the light control element may include a second quarter-wave film and a linear polarizer. The second quarter-wave film transforms the second portion of the circularly polarized image light into linearly polarized image light having a polarization state blocked by the linear polarizer in the light control element, so that eyeglow is reduced. For example, when the light control element includes a linear polarizer and a quarter-wave film, incoming unpolarized scene light from the external environment in front of the user is transformed into linearly polarized light, and 50% of the light is blocked. The first portion of the scene light passing through the linear polarizer is linearly polarized light, which is transformed into circularly polarized light by the quarter-wave film. The third portion of the scene light, reflected from the partially reflective mirror, has a reversed circular polarization state, and this light is then converted to linearly polarized light by the second quarter-wave film. The linear polarizer then blocks the reflected third portion of the scene light, thereby reducing the escaping light and reducing eyeglow. Figure 22B shows one example of a see-through display assembly with a light control element in an eyeglass frame. Eyeglass cross-section 2200B shows the components of the see-through display assembly in eyeglass frame 2202B. The light control element covers the entire see-through view seen by the user. Within the field of view 2214B of the user's eye, supporting members 2204B and 2208B are shown supporting partially reflective mirror 2210B and beam splitter layer 2212B, respectively. Supporting members 2204B and 2208B and light control element 2218B are connected to eyeglass frame 2202B. Other components, such as fold mirror 2220B and first quarter-wave film 2222B, are also connected to supporting members 2204B and 2208B, so that the combined assembly is structurally rigid.
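The intensity bookkeeping for this circular-polarizer stray-light trap can be summarized in a few lines. The sketch below is a simplified model of the stages just described (ideal quarter-wave films, a single reflection from the partial mirror); the function name and parameters are our own, not from the patent.

```python
def escaped_fraction(mirror_reflectance, polarizer_leakage=0.0):
    """Fraction of incident unpolarized ambient light that escapes back
    out through a circular-polarizer light control element.

    Stage 1: the linear polarizer passes half of the unpolarized scene light.
    Stage 2: the quarter-wave film converts it to circular polarization.
    Stage 3: the partial mirror reflects `mirror_reflectance` of it,
             reversing the handedness of the circular polarization.
    Stage 4: the return pass through the quarter-wave film yields linear
             light crossed with the polarizer, so only the polarizer's
             `polarizer_leakage` (0 for an ideal polarizer) gets out.
    """
    return 0.5 * mirror_reflectance * polarizer_leakage

# With an ideal polarizer nothing escapes; with 1% leakage and a 30%
# partial mirror, only 0.15% of ambient light re-emerges as glow.
print(escaped_fraction(0.3), escaped_fraction(0.3, 0.01))
```

The same handedness-reversal mechanism applies to the image light heading outward, which is why the element suppresses eyeglow without affecting the user's see-through view beyond the one-time 50% polarizer loss.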
Scattering of stray light in compact optics, such as those of head-mounted displays, typically comes from the sidewalls of the housing or other structures, where light encounters the surface at a steep angle. Such stray light produces a bright region of scattered light surrounding the displayed image.
There are two methods of reducing such stray light. One is to darken or roughen the sidewalls or other structures to reduce the reflectivity of the light. However, while this does increase the absorption at the surface, light reflected and scattered from the surface can still be noticeable. The other method is to provide baffles to block or trim the stray light. Blocking or trimming the light reflected and scattered from surfaces greatly reduces the effect of this stray light. In a head-mounted display, it is beneficial to use both methods to reduce stray light, because bright regions surrounding the displayed image are eliminated and the contrast of the displayed image is increased.
United States Patent 5949583 provides a viewing window at the top of a head-mounted display to keep stray light from entering from above. However, this does not address the need to control stray light generated inside the head-mounted display system.
United States Patent 6369952 provides two shields to block perimeter light around the liquid crystal display image source in a head-mounted display. The first shield is located on the input side of the liquid crystal image source, adjacent to the backlight, and the second shield is located on the exit side of the liquid crystal display. Since both shields are positioned near the liquid crystal display, "the first shield 222 and the second shield 224 respectively have apertures or windows 232, 234 that are substantially equal and congruent with the active area of the LCD" (column 15, lines 15-19). Because the shields are positioned near the image source, they have little effect on light that is emitted from the image source within the large cone angle from regions near the center of the active area of the image source. This large cone angle light can reflect in various ways from the sidewalls of the housing, thereby constituting stray light in the form of bright regions and leading to reduced contrast.
Thus, there remains a need for methods of reducing stray light from the various sources inside a head-mounted display.
Figure 160 shows an example of a display system with an optically flat reflective surface, where the optically flat reflective surface is a beam splitter comprised of an optical film on a substrate, and where the display system is a near-eye display 16002. In this example, the image source 16012 includes a projection system (not shown) that provides image light using an optical layout with a folded optical axis 16018 located within the near-eye display 16002. The optics along optical axis 16018 may include lenses that focus the image light to provide a focused image from the image source 16012 to the user's eye 16004. The beam splitter 16008 folds the optical axis 16018 from the image source 16012 to the spherical reflector 16010. The beam splitter 16008 can be a partially reflective mirror or a polarizing beam splitter. The beam splitter 16008 in the near-eye display 16002 is oriented at an angle to redirect at least a portion of the image light from the image source 16012 to the reflector 16010. From the reflector 16010, at least another portion of the image light is reflected back toward the user's eye 16004. This reflected portion of the image light passes back through the beam splitter 16008 and comes to focus at the user's eye 16004. The reflector 16010 can be a mirror or a partial mirror. In the case where the reflector 16010 is a partial mirror, scene light from the scene in front of the near-eye display 16002 can be combined with the image light, thereby presenting to the user's eye 16004 combined image light 16020 comprised of the image light along axis 16018 and the scene light 16014. The combined image light 16020 presents to the user's eye 16004 a combined image of the scene with an overlaid image from the image source.
Figure 161 shows an illustration of a near-eye display module 200. The module 200 is comprised of a reflector 16104, an image source module 16108 and a beam splitter 16102. The module can be open at the sides, with attachments located between at least some of the joining edges of the reflector 16104, image source module 16108 and beam splitter 16102. Alternatively, the module 200 can be closed at the sides by sidewalls to provide an enclosed module that keeps dust, dirt and water from contacting the inner surfaces of the module 200. The reflector 16104, image source module 16108 and beam splitter 16102 can be manufactured separately and then attached together, or at least some of them can be manufactured together in joined subassemblies. In module 200, optical films can be used on the beam splitter 16102 or the reflector 16104. In Figure 161, the beam splitter 16102 is shown as a flat surface, while the reflector 16104 is shown as a spherical surface. In the near-eye display module 200, both the reflector 16104 and the beam splitter 16102 can be used to provide an image to the user's eye, as shown in Figure 160, so it is important that the surfaces be optically flat or of uniform optical quality.
Given that the image source 16108 includes a projection system with a light source that produces light with a large cone angle, the image light also has a large cone angle. As a result, the image light interacts with the sidewalls of the module 200, and this interaction can produce reflected and scattered light in the form of bright regions, which the user observes as bright regions surrounding the displayed image. These bright regions are very distracting to the user, because they can appear as halos around the displayed image. In addition, the scattered light can randomly add low-level light over the displayed image, degrading the contrast of the image.
Figure 162 shows an illustration of the optics associated with this type of head-mounted display 16200. In the optics, a light source 16204 provides light with a large cone angle that includes a central ray 16202 and edge rays 16224. The light source 16204 can provide polarized light. Light passes from the light source 16204 to an illumination beam splitter 16210, which reflects a portion of the light toward a reflective image source 16208, which can be an LCOS display. A first portion of the light is reflected by the image source 16208 while having its polarization state changed in correspondence with the image content being displayed. A second portion of the light then passes through the illumination beam splitter 16210 and then through one or more lenses 16212 that expand the cone angle of the light. A third portion of the light is reflected at an angle by the imaging beam splitter 16220 toward the spherical (or aspherical) partial mirror 16214. A fourth portion of the light is reflected by the partial mirror 16214, which causes the light to converge and the image to come to focus at the user's eye 16228. After the fourth portion of the light is reflected by the partial mirror 16214, a fifth portion of the light passes through the imaging beam splitter 16220 and on to the user's eye 16228, where a magnified version of the image displayed by the image source 16208 is provided to the user's eye 16228. In a see-through head-mounted display, light from the environment 16218 (or scene light) passes through the partial mirror 16214 and the imaging beam splitter 16220 to provide a see-through image of the environment. The user is then provided with a combined image comprised of the displayed image from the image source and the see-through image of the environment.
The central ray 16202 passes through the center of the optics along the optical axis of the optics of the head-mounted display. The optics include: the illumination beam splitter 16210, the image source 16208, the lens 16212, the imaging beam splitter 16220 and the partial mirror 16214. The edge rays 16224 travel along the sides of the housing 16222, where the light can interact with the sidewalls of the housing 16222 and where the edge rays 16224 can be reflected or scattered by the sidewalls, as shown in Figure 162. Light from this reflection and scattering of the edge rays 16224 is visible to the user as bright regions surrounding the displayed image or as reduced contrast in the image. The present invention provides various methods of reducing the reflected and scattered light, by blocking or trimming the light reflected or scattered from the sidewalls, to reduce the bright regions.
Figure 163 shows an illustration of a first embodiment of the invention, in which a baffle 16302 is added at the sides of the housing 16222, between the illumination beam splitter 16210 and the lens 16212. The baffle 16302 blocks or trims the edge rays 16224 before they pass to the lens 16212. The baffle 16302 can be made of any material that is opaque, so that the edge rays 16224 are blocked or trimmed. In a preferred embodiment, the baffle 16302 is made of a black material with a matte finish, so that incident light is absorbed by the baffle. The baffle 16302 can be made of a sheet material with a hole that is positioned in the housing 16222, or the baffle 16302 can be made as a portion of the housing 16222. Because the baffle 16302 is placed at a distance from the image source 16208 and the image light is diverging, the aperture formed by the surrounding baffle 16302 is larger than the active area of the image source 16208, so the image provided by the image source 16208 is not trimmed at the edges by the baffle; as a result, the entire image provided by the image source 16208 can be seen by the user's eye, as shown in Figure 163. In addition, the baffle preferably has a thin cross section (as shown in Figure 163) or sharp edges, so that light is not scattered from the edges of the baffle.
Figure 164 shows an illustration of another embodiment of the invention, in which a baffle 16402 is added at the entering surface of the lens. The baffle 16402 can be made as a portion of the housing 16222, or the baffle 16402 can be a coating applied to the lens 16212. In either case, the baffle 16402 should be opaque, and preferably black with a non-glossy finish, to block and absorb incident light.
Figure 165 shows an illustration of an embodiment of the invention similar to the embodiment shown in Figure 164, except that the baffle is on the exit side of the lens 16212. In this embodiment, a baffle 16502 is provided to block or trim the edge rays 16224 after they pass through the lens 16212.
Figure 166 shows an illustration of another embodiment of the invention, in which a baffle 16602 is attached to the housing 16222 between the lens 16212 and the imaging beam splitter 16220. The baffle 16602 can be a portion of the housing 16222, or the baffle 16602 can be a separate structure within the housing 16222. The baffle 16602 blocks or trims the edge rays 16224 so that bright regions surrounding the displayed image are not presented to the user's eye 16228.
Figure 167 shows an illustration of yet another embodiment of the invention, in which an absorptive coating 16702 is applied to the sidewalls of the housing 16222 to reduce reflection and scattering of the incident light and edge rays 16224. The absorptive coating 16702 can be combined with baffles 16302, 16402, 16502 or 16602.
Figure 168 shows an illustration of another source of stray light in a head-mounted display, where stray light 16802 comes directly from the edge of the light source 16204. This stray light 16802 can be particularly bright because it comes directly from the light source 16204 without first reflecting from the illumination beam splitter 16210 and then from the image source 16208. Figure 169 shows an illustration of yet another source of stray light 16902 from the light source 16204, where the stray light reflects from the surface of the image source 16208, where its polarization state is changed, and the stray light 16902 can then pass through the illumination beam splitter due to its relatively steep angle. This stray light 16902 can then reflect from any reflective surface in the housing or from the edges of the lens 16212, as shown in Figure 169. Figure 170 shows an illustration of one more embodiment of the invention, in which a baffle 17002 has been placed adjacent to the light source 16204. The baffle 17002 is opaque and extends outward from the light source 16204, so that the stray light 16802 and 16902 is blocked or trimmed immediately after the light source 16204, thereby preventing the stray light from reaching the user's eye 16228.
In another embodiment, the baffles or coatings shown in Figures 163-167 and 169-170 are combined to further reduce the stray light in the head-mounted display, thereby reducing the bright regions surrounding the displayed image or increasing the contrast in the displayed image. Multiple baffles can be used between the light source 16204 and the imaging beam splitter 16220. In addition, as shown in Figure 171, an absorptive coating 17102 with ridges can be used, in which a series of small ridges or steps acts as a series of baffles to block or trim the edge rays over the entire sidewall area of the housing 16222. The ridges 17102 can be made as a portion of the housing 16222, or as another layer that is attached to the interior sidewalls of the housing 16222.
Figure 172 shows a further embodiment of a tape or sheet 17210, where the tape or sheet includes a carrier 17212 and ridges 17214 that can be used to block reflected light as shown in Figure 171. The ridges 17214 have a steep edge on one side and a sharply inclined edge on the other side, so that incident light entering from the sharply inclined side is blocked. The ridges 17214 can be solid ridges with a triangular cross section that has one sharp edge, as shown in Figure 172, or they can be thin angled shims attached at one edge, or they can be angled fibers attached at one end, so that one surface forms an angle relative to the sidewall and incident light is blocked. An advantage of the tape or sheet 17210 is that the ridges 17214 can be quite thin and the ridges can cover the major areas of the housing 16222. Another advantage of the tape or sheet 17210 is that the ridges 17214 are more easily made than the ridges shown in Figure 171, which may be difficult to mold as a portion of the housing.
In all of the embodiments, the surrounding baffles can form apertures whose size corresponds to their distance along the optical axis from the image source, so that the image light can diverge along the optical axis, thereby providing an untrimmed view of the image source 16208 to the user's eye 16228.
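The aperture-sizing relationship just described, where a baffle placed farther from the image source needs a proportionally larger hole so that the diverging image cone is not trimmed, can be sketched as simple geometry. This is an illustrative sketch only: the active-area width, cone half-angle, and baffle distances below are hypothetical numbers, not values from this disclosure.

```python
import math

def min_aperture_width(active_area_mm, distance_mm, half_angle_deg):
    """Smallest baffle aperture that passes the whole diverging image cone.

    Image light diverges from the edges of the image source's active area
    at the cone half-angle, so at a distance d along the optical axis the
    beam footprint has grown by 2 * d * tan(half_angle) beyond the
    active-area width. A baffle hole at least this wide trims only
    sidewall rays, not the displayed image.
    """
    spread = 2.0 * distance_mm * math.tan(math.radians(half_angle_deg))
    return active_area_mm + spread

# Hypothetical numbers: a 10 mm active area, a +/- 15 degree image cone,
# and baffles placed 5 mm and 12 mm from the image source.
for d_mm in (5.0, 12.0):
    print(round(min_aperture_width(10.0, d_mm, 15.0), 2))  # prints 12.68, then 16.43
```

For a fixed cone angle the required aperture grows linearly with distance from the image source, which is consistent with each baffle position in Figures 163-166 calling for a different hole size.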
In an embodiment, an absorptive polarizer in the optics assembly is used to reduce stray light. The absorptive polarizer may include an antireflection coating. The absorptive polarizer can be placed after the focusing lens of the optics assembly to reduce the light passing through the optically flat film of the optics assembly. The light from the image source can be polarized to increase contrast.
In an embodiment, antireflection coatings in the optics assembly can be used to reduce stray light. An antireflection coating can be placed on a polarizer or on a retarding film of the optics assembly. The retarding film can be a quarter-wave film or a half-wave film. An antireflection coating can also be placed on the outer surface of the partially reflective mirror. The light from the image source can be polarized to increase contrast.
Referring to Figure 102A, the image source 10228 directs image light to the beam splitter layer of the optics assembly. Figure 103 depicts a magnified view of the image source 10228. In this particular embodiment, the image source 10228 is shown to include a light source (LED bar 10302) that directs light through a diffuser 10304 and a pre-polarizer 10308 to a curved wire-grid polarizer 10310, where the light is reflected to an LCoS display 10312. The image light from the LCoS then passes through the curved wire-grid polarizer 10310 and a half-wave film 10312 and back to the beam splitter layer of the optics assembly 10200. In embodiments, the optics assembly including optical modules 10204, 10210, 10212 and 10230 may be provided as a sealed optics assembly, such as one that is detachable (e.g., snapping in and out), replaceable, and the like, and the image source 10228 may be provided as an enclosed assembly in the eyepiece frame. This allows the sealed optics assembly to be waterproof and dustproof, replaceable, customizable, and the like. For example, a given sealed optics assembly may be fitted with corrective optics for one individual, and may be replaced with a second sealed optics assembly for another individual with different corrective optics requirements (e.g., a different prescription). In embodiments, there may be applications where it is not necessary for both eyes to receive input from the eyepiece. In this case, a person can simply detach one side and use only a single side for projection of content. In this way, the user now has an unobstructed optical path for the eye where the assembly has been removed, and the eyepiece runs as only half the system, saving battery life, and so on.
The optics assembly can be divided into separate portions with regard to what is sealed, such as being made up of an image generation facility 10228 and a display optics facility 10204, 10210, 10212 and 10230, as shown in Figure 102A. In another illustration, Figure 147 shows an embodiment configuration of the eyepiece in which the display optics are shown as "projection screens" 14608a and 14608b. Figure 102A also shows portions of the eyepiece electronics and projection system 14602, where this portion of the projection system may be referred to as an image generation facility. The image generation facility and the display optics facility may be sealed subassemblies, such as where the optics are protected from intrusion of contaminants from the surrounding environment. Further, the display optics may be removable, such as for replacement, for removal to allow the user an unobstructed view, for non-destructive accommodation of a forced removal (e.g., where the display optics are struck and detach from the body of the eyepiece without damage), and the like. In embodiments, the present invention may include an interactive head-mounted eyepiece worn by a user, where the eyepiece includes an optics assembly (through which the user views the surrounding environment and displayed content) and an integrated image source (adapted to introduce the content to the optics assembly), where the optics assembly includes an image generation facility mounted within the frame of the eyepiece and a display optics facility positioned in front of the user's eye and removable from the frame of the eyepiece, and where the image generation facility is sealed within the frame to reduce contamination from the surrounding environment. In embodiments, the seal may be a sealed optical window. As described herein, the eyepiece may further include a processing facility, a power management facility, a removal sensor, a battery, and the like, where the power management facility may detect removal of the display optics facility through a removal indication from the removal sensor and selectively reduce power to components of the eyepiece in order to reduce battery power consumption. For example, the component with reduced power may be the image source, such as by reducing the brightness of the image source, turning off power to the image source, and the like, where the power management facility may monitor for reattachment of the display optics facility and restore the power usage of the image source to the operating level before removal. The display optics facility may be removable through a breakaway mechanism, such that when the display optics facility is unintentionally removed, it is removed without damaging the eyepiece. The display optics facility may be removable through an attachment mechanism, such as a magnet, a latch, a rail, a snap-on connector, and the like. The display optics facility may provide vision correction for users who need corrective eyeglasses, where the display optics facility may be replaced for the purpose of changing the vision correction prescription of the eyepiece. The eyepiece may have two separate removable optics assemblies, one for each eye, where one of the separate optics assemblies is removed to allow monocular use of the remaining one of the separate optics assemblies. For example, the monocular use may be for gun sighting, where the side of the eyepiece from which the display optics facility has been removed is used for sighting the gun, thereby allowing the user an unobstructed visual path for aiming while retaining the facilities provided by the eyepiece for the other eye. The display optics facility may be removable to allow exchange between display optics suited for indoor use and display optics suited for outdoor use. For example, indoor use versus outdoor use may call for different filters, fields of view, contrast, shading, and the like. The display optics facility may be adapted to receive attachment elements, optical elements, mechanical elements, adjustment elements, and the like. For example, an optical element may be inserted at the display optics to provide an adjustment for the user. The display optics facility may also be replaced to change the field of view provided, such as by replacing a display optics facility with a first field of view with a display optics facility with a second field of view.
Referring to Figure 104, the LED provides unpolarized light. The diffuser spreads and homogenizes the light from the LED. The absorptive pre-polarizer converts the light to S polarization. The S polarized light is then reflected by the curved wire-grid polarizer toward the LCOS. The LCOS reflects the S polarized light and, depending on the local image content, converts it into P polarized light. The P polarized light passes through the curved wire-grid polarizer and becomes P polarized image light. The half-wave film converts the P polarized light into S polarized light.
Referring again to Figure 102A, the embodiment in which the beam splitter layer 10204 is a polarizing beam splitter and the image source provides polarized image light 10208, so that the reflected image light 10208 is linearly polarized light, is shown along with the associated polarization control in Figure 102A. For the case where the image source provides linearly polarized image light and the beam splitter layer 10204 is a polarizing beam splitter, the polarization state of the image light is aligned with the polarizing beam splitter so that the image light 10208 is reflected by the polarizing beam splitter. Figure 102A shows the reflected image light as polarized with an S state. In the case where the beam splitter layer 10204 is a polarizing beam splitter, a first quarter-wave film 10210 is provided between the beam splitter layer 10204 and the partially reflective mirror 10212. The first quarter-wave film 10210 converts the linearly polarized image light into circularly polarized image light (shown in Figure 102A as S being converted to CR). The reflected first portion of the image light 10208 is then also circularly polarized, but with a reversed circular polarization state (shown as CL in Figure 102A), so that after passing back through the quarter-wave film, the polarization state of the reflected first portion of the image light 10208 is changed (to P polarization) compared to the polarization state of the image light 10208 provided by the image source (shown as S); as a result, the reflected first portion of the image light 10208 passes through the polarizing beam splitter without reflection loss. When the beam splitter layer 10204 is a polarizing beam splitter and the see-through display assembly 10200 includes the first quarter-wave film 10210, the light control element 10230 comprises a second quarter-wave film and a linear polarizer 10220. In embodiments, the light control element 10230 includes a controllable darkening layer 10214. The second quarter-wave film 10218 converts the second portion of the circularly polarized image light 10208 into linearly polarized image light 10208 (shown as CR being converted to S), where the linearly polarized image light has a polarization state that is blocked by the linear polarizer 10220 in the light control element, so that eyeglow is reduced.
When the light control element 10230 includes the linear polarizer 10220 and the quarter-wave film 10218, incoming unpolarized scene light 10222 from the external environment in front of the user is converted into linearly polarized light (shown in Figure 102A as a P polarization state), and 50% of the light is blocked. The first portion of the scene light 10222 that passes through the linear polarizer 10220 is linearly polarized, and this light is converted into circularly polarized light by the quarter-wave film (shown in Figure 102A as P being converted to CL). The third portion of the scene light that is reflected from the partially reflective mirror 10212 has a reversed circular polarization state (shown in Figure 102A as being converted from CL to CR), and this light is then converted into linearly polarized light by the second quarter-wave film 10218 (shown in Figure 102A as CR being converted to S polarization). The linear polarizer 10220 then blocks the reflected third portion of the scene light, thereby reducing escaping light and reducing eyeglow.
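The scene-light round trip just described, through the linear polarizer, the quarter-wave film, a reflection from the partial mirror with reversed handedness, the quarter-wave film again, and finally the same polarizer, can be sketched with Jones calculus. This is a simplified illustrative model under assumed conventions (polarizer axis along x, quarter-wave fast axis at 45 degrees, and the mirror treated as an identity in a fixed transverse basis, with the handedness reversal captured by the double pass through the film); it is not code or notation from this disclosure.

```python
def apply(m, v):
    """Apply a 2x2 complex Jones matrix to a Jones vector."""
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

def intensity(v):
    return abs(v[0]) ** 2 + abs(v[1]) ** 2

POL_X = [[1, 0], [0, 0]]                   # linear polarizer, axis along x
s = 2 ** -0.5
QWP_45 = [[s, -1j * s], [-1j * s, s]]      # quarter-wave film, fast axis at 45 deg

first_portion = apply(POL_X, [1, 0])       # linearly polarized scene light
at_mirror = apply(QWP_45, first_portion)   # circularly polarized at the partial mirror
returned = apply(QWP_45, at_mirror)        # reflected portion re-crosses the film
escaping = apply(POL_X, returned)          # meets the same linear polarizer

print(round(intensity(at_mirror), 6))      # prints 1.0 (full power reaches the mirror)
print(round(intensity(escaping), 6))       # prints 0.0 (no eyeglow escapes)
```

The double pass through the quarter-wave film acts as a half-wave retarder at 45 degrees, rotating the linear polarization by 90 degrees, so the returning third portion of the scene light arrives orthogonal to the polarizer axis and is extinguished, which is the eyeglow-reduction mechanism described above.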
As shown in Figure 102A, the reflected first portion of the image light 10208 and the transmitted second portion of the scene light have the same circular polarization state (shown as CL), so that they combine and are converted together by the first quarter-wave film 10210 into linearly polarized light (shown as P), which, with the beam splitter layer 10204 being a polarizing beam splitter, passes through the beam splitter. The linearly polarized combined light 10224 then provides a combined image to the eye 10202 of the user located at the back of the see-through display assembly 10200, where the combined image is comprised of the displayed image from the image source overlaid onto portions of the see-through view of the external environment in front of the user.
The beam splitter layer 10204 includes an optically flat film, such as the Asahi TAC films described herein. The beam splitter layer 10204 can be positioned at an angle in front of the user's eye, so that the beam splitter layer reflects a corresponding portion of the image light and transmits the scene light of the see-through view from the surrounding environment; in this way, a combined image comprised of portions of the image light and the transmitted scene light is provided to the user's eye. The optically flat film can be a polarizer, such as a wire-grid polarizer. The optically flat film can be laminated to a transparent substrate. The optically flat film can be molded, cast, glued, or the like, onto or into one of the optical surfaces of the eyepiece, such as the beam splitter 10202. The optically flat film can be positioned at less than 40 degrees to vertical. The curved polarizing film can have a ratio of light source height to width of the illuminated area of less than 1:1. The highest point of the curved film is less than the length of the narrowest axis of the display. In embodiments, once the optical film is in place on the beam splitter, additional optics, such as corrective optics, prescription optics, and the like, can be added to the surface, such as to sandwich the film flat in an interlayer between them.
The present invention also provides methods for providing optically flat surfaces with optical films. Optical films are a convenient way to form an optical structure that has very different optical characteristics from the rest of the structure of an imaging device. To provide their function to the imaging device, optical films need to be attached to the optics. When an optical film is used in a reflective manner, it is critical that the reflective surface be optically flat; otherwise the wavefront of the light reflected from the reflective surface will not be preserved and the image quality will be degraded. An optically flat surface can be defined as a surface that is uniform within 5 wavelengths of light per inch of surface, as measured at the wavelength of light used by the imaging device and as compared to either a flat surface or a desired optical curve.
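The flatness criterion just stated, uniform within 5 wavelengths of light per inch of surface, can be turned into an allowed deviation for a given part. The wavelength and part size used below are hypothetical examples, not values from this disclosure.

```python
def max_flatness_deviation_nm(wavelength_nm, extent_inches):
    """Allowed surface deviation under the 5-wavelengths-per-inch criterion,
    measured against either a flat reference or the desired optical curve."""
    return 5.0 * wavelength_nm * extent_inches

# Hypothetical case: green image light (550 nm) across a 0.8 inch beam splitter film.
print(round(max_flatness_deviation_nm(550.0, 0.8)))  # prints 2200 (nm), i.e. about 2.2 micrometers
```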
Optically flat surfaces including optical films as described in the present invention can be included in display systems, where the display systems include: projectors, projection televisions, near-eye displays, head-mounted displays, see-through displays, and the like.
Figure 140 shows an example of a display system with an optically flat reflective surface, where the optically flat reflective surface is a beam splitter comprised of an optical film on a substrate, and where the display system is a near-eye display 14000. In this example, the image source 14010 includes a projection system (not shown) that provides image light using an optical layout with a folded optical axis 14014 located within the near-eye display 14000. The optics along optical axis 14014 may include lenses that focus the image light to provide a focused image from the image source 14010 to the user's eye 14002. The beam splitter 14004 folds the optical axis 14014 from the image source 14010 to the spherical reflector 14008. The beam splitter 14004 can be a partially reflective mirror or a polarizing beam splitter layer. The beam splitter 14004 in the near-eye display 14000 is oriented at an angle to redirect at least a portion of the image light from the image source 14010 to the reflector 14008. From the reflector 14008, at least another portion of the image light is reflected back toward the user's eye 14002. This reflected portion of the image light passes back through the beam splitter 14004 and comes to focus at the user's eye 14002. The reflector 14008 can be a mirror or a partial mirror. In the case where the reflector 14008 is a partial mirror, scene light from the scene in front of the near-eye display 14000 can be combined with the image light, thereby presenting to the user's eye 14002 combined image light 14018 comprised of the image light along axis 14014 and the scene light along axis 14012. The combined image light 14018 presents to the user's eye a combined image of the scene with an overlaid image from the image source.
Figure 141 shows an illustration of a near-eye display module 14100. The module 14100 is comprised of a reflector 14104, an image source module 14108 and a beam splitter 14102. The module can be open at the sides, with attachments located between at least some of the joining edges of the reflector 14104, image source module 14108 and beam splitter 14102. Alternatively, the module 14100 can be closed at the sides by sidewalls to provide an enclosed module that keeps dust, dirt and water from contacting the inner surfaces of the module 14100. The reflector 14104, image source module 14108 and beam splitter 14102 can be manufactured separately and then attached together, or at least some of them can be manufactured together in joined subassemblies. In module 14100, optical films can be used on the beam splitter 14102 or the reflector. In Figure 141, the beam splitter 14102 is shown as a flat surface, while the reflector 14104 is shown as a spherical surface. In the near-eye display module 14100, both the reflector 14104 and the beam splitter 14102 can be used to provide an image to the user's eye, as shown in Figure 140, so it is important that the surfaces be optically flat or of uniform optical quality.
Figure 142 shows a schematic illustration of one embodiment of the invention, a pellicle-style film assembly 14200. The pellicle-style film assembly 14200 includes a frame 14202 comprised of an upper frame member 14202a and a lower frame member 14202b. The optical film 14204 is held between the frame members 14202a and 14202b with adhesive or fasteners. To improve the flatness of the optical film 14204, the optical film 14204 can be stretched in one or more directions while the adhesive is applied and the frame members 14202a and 14202b are bonded to the optical film 14204. After the optical film 14204 has been bonded to the frame 14202, the edges of the optical film can be trimmed to provide a smooth surface at the outer edge of the frame 14202.
In some embodiments of the invention, the optical film 14204 is folded into a series of optically flat surfaces, and the interfaces of the frame members 14202a and 14202b have a matching folded shape. The folded film is then stretched along the fold direction and bonded in place, so that the frame members 14202a and 14202b hold the optical film 14204 in the folded shape, with each surface in the series of optically flat surfaces held in position.
In all cases, after the frame members 14202a and 14202b have been bonded to the optical film 14204, the resulting tensioned membrane film assembly 14200 is a rigid assembly that can be placed into the optics of, for example, the near-eye display module 14100 to form the beam splitter 14102. In this embodiment, the tensioned membrane film assembly 14200 is a replaceable beam splitter 14102 assembly within the near-eye display module 14100. The side walls of the near-eye display module 14100 can have slots into which the frame 14202 snaps, or they can instead provide flat surfaces joining the side walls, on top of which the frame 14202 can be placed.
Figure 143 is an illustration of an insert-molded assembly 14300 that includes an optical film 14302. In this embodiment, the optical film 14302 is placed into a mold, and a viscous plastic material is injected through a mold gate 14308 so that the plastic fills the mold cavity and forms a molded structure 14304 that abuts and lies behind the optical film 14302. After the plastic material hardens in the mold, the mold is opened along the parting line 14310 and the insert-molded assembly 14300 is removed. The optical film 14302 is then embedded in and attached to the insert-molded assembly 14300. To improve the optical flatness of the optical film 14302 in the insert-molded assembly 14300, the inner surface of the mold against which the optical film 14302 is placed is an optically flat surface. In this way, the viscous plastic material forces the optical film 14302 against the optically flat surface of the mold during the molding process. This technique can be used to provide optically flat surfaces that are flat or that have a desired optical curvature, as described above. In another embodiment, the optical film 14302 can be provided with an adhesive layer or tie layer to increase the adhesion between the optical film 14302 and the molded structure 14304.
In another embodiment, the optical film 14302 is placed in the mold with a protective film between the mold surface and the optical film 14302. The protective film can be attached to the optical film 14302 or to the mold. The protective film is smoother or flatter than the mold surface, so that it presents a smoother or flatter surface to the optical film 14302 molded against it. The protective film can therefore be any suitable material, such as plastic or metal.
Figure 144 shows an illustration of a lamination process for making a laminate containing an optical film 14400. In this embodiment, upper and lower platens 14408a and 14408b are used to laminate the optical film 14400 onto a substrate 14404. An adhesive 14402 can optionally be used to bond the substrate 14404 to the optical film 14400. In addition, one or more of the platens 14408a and 14408b can be heated, or the substrate 14404 can be heated, to provide a higher degree of adhesion between the substrate 14404 and the optical film 14400. Heating the substrate or one or more of the platens 14408a and 14408b can also be used to soften the substrate 14404, thereby providing more uniform pressure behind the optical film 14400 to improve the flatness or smoothness of the optical film 14400 in the laminate. The laminate with the optical film 14400 of this embodiment can be used as a replaceable beam splitter in the near-eye display module 14100, as described above for the tensioned membrane film assembly 14200.
Figure 145 A-C is shown for the application work with the optical surface production molded structure 14502 for including optical film 14500 The diagram of skill.In this embodiment, optical film 14500 is applied in molded structure 14502 with rubber applicator 14508 Optically flat surface 14504.Adhesive phase can be applied to optically flat surface 14504 or the optical film of molded structure 14502 Optical film 14500 is adhered to molded structure 14502 by any of 14500 bottom surface.Rubber applicator 14508 can To be the relatively soft and flexible material with curved surface, so that the central part of optical film 14500 is forced to contact The optically flat surface 14504 of molded structure 14502.When rubber applicator 14508 further pushes down on, optical film 14500 Contact area size between the optically flat surface 14504 of molded structure 14502 increases, such as Figure 145 A, 145B and 145C Shown in.This progressive application process provides the highly uniform application of pressure, this allows the air of interface applying It is ejected during journey.The optically flat surface 14504 of progressive application process and molded structure 14502, which provides, is attached to molding knot The optically flat optical film 14500 of the inner surface of structure 14502, as shown in Figure 145 C.For optical film 14500 to be adhered to mould The adhesive phase of structure 14502 processed can be attached to the optically flat surface on 14502 inside of optical film 14500 or molded structure 14504.It will be understood to those skilled in the art that this application process similarly can apply optics in the outer surface to molded structure Film.In addition, optically flat surface can be flat surfaces or surface or a series of optically flat tables with required optical curve Face, wherein rubber applicator is shaped, to provide the progressive application of pressure with the application of optical film.
In embodiments, an image display system can include an optically flat optical film and a display module housing, where the housing includes a substrate that holds the optical film optically flat, an image source, and a viewing position, and where the image provided by the image source is reflected from the optical film to the viewing position. In embodiments, the optical film of the image display system can be molded into the display module. In various embodiments, the optical film can be applied to the display module. Further, in embodiments, the optical film of the display system can be a wire-grid polarizer, a mirror, a partial mirror, a holographic film, or the like. In embodiments, the image display system can be a near-eye display. In embodiments, the optical film is molded into the display module, and while the optical film is being molded into the display module, the optical film can be held against an optically flat surface. In embodiments, the optical film of the image display system can have an optical flatness of 5 wavelengths of light per inch.
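To put the "5 wavelengths of light per inch" flatness specification in concrete terms, the short sketch below converts it into a peak-to-valley surface deviation over a given span. The helper function, the assumption of a 550 nm (green) reference wavelength, and the half-inch example span are my own illustration, not part of the patent:

```python
# Illustrative only: convert a "waves per inch" optical-flatness spec into a
# peak-to-valley height deviation. Assumes flatness is quoted at a 550 nm
# reference wavelength (a common convention, not stated in the source).

GREEN_WAVELENGTH_NM = 550.0
MM_PER_INCH = 25.4

def flatness_deviation_um(waves_per_inch: float, span_mm: float) -> float:
    """Peak-to-valley deviation in microns allowed over `span_mm` for a
    flatness spec of `waves_per_inch` wavelengths per inch."""
    waves = waves_per_inch * (span_mm / MM_PER_INCH)
    return waves * GREEN_WAVELENGTH_NM / 1000.0  # nm -> um

# A 5-waves-per-inch film over a hypothetical 12.7 mm (half-inch) beam splitter
# may deviate by about 1.4 um peak-to-valley:
print(round(flatness_deviation_um(5.0, 12.7), 3))  # -> 1.375
```

This scale of deviation (a micron or so across the part) is why the patent emphasizes molding or laminating the film against an optically flat reference surface.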
In one embodiment, an image display system including an optically flat optical film can include a substrate that holds the optical film optically flat, a display module housing, an image source, and a viewing position, where the image provided by the image source is reflected from the optical film to the viewing position, and where the substrate with the optical film is replaceable within the display module housing. In such embodiments, the substrate of the image display system can be a frame in which the optical film is held under tension, the substrate can be a panel molded behind the film, and/or the substrate can be a laminate. In addition, the optical film of the image display system can be a beam splitter, a polarizing beam splitter, a wire-grid polarizer, a mirror, a partial mirror, a holographic film, or the like. Further, the image display system can be a near-eye display. In embodiments, when a panel is molded behind the optical film of the image display system, the optical film can be held against an optically flat surface. Likewise, in various embodiments, when a panel is laminated to the optical film of the image display system, the optical film can be held against an optically flat surface. In various embodiments, the optical film of the image display system can have an optical flatness of 5 wavelengths of light per inch.
In one embodiment, the components in Figure 102A together form an electro-optical module. The angle of the optical axis associated with the display can be 10 degrees or more from vertical. This tilt refers to the degree to which the top of the optical module leans forward. The tilt allows the beam splitter angle to be reduced, and reducing the beam splitter angle makes the optical module thinner. The ratio of the height of the curved polarizing film to the width of the reflective image display is less than 1:1. The curve of the polarizing film determines the width of the illuminated area on the reflective display, and the tilt of the curved region determines the position of the illuminated area on the reflective display. The curved polarizing film reflects illuminating light of a first polarization state onto the reflective display, which changes the polarization of the illuminating light and produces image light, and the curved polarizing film transmits the reflected image light. The curved polarizing film includes a portion above the light source that is parallel to the reflective display. The height of the image source can be at least 80% of the width of the active display area, at least 3.5 mm, or less than 4 mm.
In portable display systems, it is important to provide a bright, compact, and lightweight display. Portable display systems include mobile phones, laptop computers, tablet computers, near-eye displays, and head-mounted displays.
The present invention provides a compact and lightweight frontlight for portable display systems, in which the frontlight is formed from a partially reflective film that redirects light from an edge light to illuminate a reflective image source. The partially reflective film can be a partial-mirror beam splitter film or a polarizing beam splitter film. The polarizing beam splitter film can be a multilayer dielectric film or a wire-grid polarizer film. Polarizing beam splitter films are known to provide highly efficient reflection of one polarization state while allowing the other polarization state to pass through. Multilayer dielectric films can be purchased under the name DBEF from the 3M Company of Minneapolis, Minnesota, USA. Wire-grid polarizer film can be purchased under the name WGF from Asahi-Kasei E-Materials of Tokyo, Japan.
An edge light provides a compact light source for a display, but because it is located at the edge of the image source, the light must be redirected 90 degrees to illuminate the image source. When the image source is reflective, such as a liquid crystal on silicon (LCOS) image source, the illuminating light must be polarized. The polarized light is reflected by the surface of the image source, and the polarization state of the light is changed in accordance with the image content being displayed. The reflected light then passes back through the frontlight.
Figure 187 shows a schematic illustration of a prior-art display assembly 18700 with a solid beam splitter block 18718 serving as the frontlight. A display assembly includes a frontlight, one or more light sources, and an image source. In the display assembly 18700, one or more light sources 18702 are included to provide light, shown as rays 18712. The light sources can be LEDs, fluorescent lamps, OLEDs, incandescent lamps, or solid-state lamps. The light 18712 passes through a diffuser 18704, which deflects and disperses the light to obtain more uniform illumination. If the diffused light is to be polarized, the diffuser includes a linear polarizer. The diffused rays 18714 pass through the solid beam splitter block 18718 toward a partially reflective layer 18708, where they are partially reflected toward the reflective image source 18720. The diffused rays 18714 are then reflected by the reflective image source 18720, forming image light 18710, which is transmitted by the partially reflective layer 18708. The image light 18710 can then pass into associated imaging optics (not shown) to present an image to the viewer. However, as can be seen in Figure 187, the height of the light source region, shown here as the diffuser 18704, is the same as the width of the illuminated reflective image source 18720. The partially reflective layer 18708 is placed at a 45-degree angle to provide image light 18710 that proceeds straight, or vertically, into the associated imaging optics. As a result, the frontlight shown in Figure 187 is relatively large.
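The size penalty of the prior-art 45-degree geometry can be seen with a little trigonometry. The sketch below is my own geometric illustration (the function name, dimensions, and the 30-degree comparison are assumptions, not from the patent): a flat splitter tilted at angle θ from the image-source plane and spanning an image source of width w needs a frontlight height of w·tan(θ), so at 45 degrees the height equals the illuminated width.

```python
# Illustrative geometry sketch: frontlight height for a flat beam splitter
# spanning an image source of a given width, tilted at a given angle from
# the image-source plane. Not from the patent; a simple trig model.
import math

def headlight_height(image_width_mm: float, splitter_angle_deg: float) -> float:
    """Vertical extent of a flat splitter film that spans `image_width_mm`
    while tilted `splitter_angle_deg` from the image-source plane."""
    return image_width_mm * math.tan(math.radians(splitter_angle_deg))

# At the conventional 45 degrees, height equals the illuminated width:
print(round(headlight_height(10.0, 45.0), 3))  # -> 10.0
# A shallower 30-degree flat film would be thinner, but (as discussed for
# Figure 189) it no longer redirects the light orthogonally onto the source:
print(round(headlight_height(10.0, 30.0), 3))  # -> 5.774
```

This is why the later embodiments use curved or folded films: they reduce the diffuser height without sacrificing orthogonal illumination.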
In imaging systems it is generally important to preserve the wavefront from the image source in order to provide a high quality image with good resolution and contrast. Consequently, as is known to those skilled in the art, the image light 18710 must travel orthogonally from the reflective image source 18720 to provide a uniform wavefront to the associated imaging optics, so that a high quality image is delivered to the viewer. The diffused rays 18714 must therefore be redirected by the partially reflective film 18708 to be orthogonal to the reflective image source 18720, so that they can be reflected vertically (as illustrated in Figures 187-198) and passed into the associated imaging optics.
Figure 188 shows another prior-art display assembly 18802, which includes a partially reflective film 18804 that is supported at its edges and is otherwise freestanding and unsupported above the reflective image source 18720. This display assembly works in a manner similar to the display assembly shown in Figure 187, the difference being that the display assembly 18802 is lighter in weight than the display assembly 18700 because it has no solid beam splitter block 18718. As can be seen in Figure 188, the height of the diffuser 18704 is again the same as the width of the reflective image source 18720, providing image light 18808 that travels vertically into the associated imaging optics after being reflected by the reflective image source 18720.
Figure 189 shows a schematic illustration of what happens to the light rays in a display assembly 18902 when the partially reflective film 18804 is placed at an angle of less than 45 degrees. In this case, portions of the reflective image source 18720 are not illuminated uniformly. Light that illuminates the portion of the reflective image source farthest from the diffuser either does not travel straight into the associated imaging optics (as in the case of ray 18904), or it first reflects off the surface of the reflective image source (as in the case of ray 18908), which changes its polarization state; if the partially reflective film is a polarizing beam splitter film (also known as a reflective polarizer film), that light then passes through the film. Consequently, when the associated imaging optics can only use image light that travels straight from the reflective image source 18720, placing the partially reflective film 18804 at an angle of less than 45 degrees reduces the illuminated region of the reflective image source 18720, producing dark portions in the corresponding image.
In one embodiment of the invention, shown in Figure 190, a curved partially reflective surface 19004 is provided to redirect the diffused light 19010 provided by the light source 18702 downward to illuminate the reflective image source 18720. The curved partially reflective surface 19004 can be a polarizing beam splitter film, which is thin and flexible. In this case, the diffuser 18704 includes a linear polarizer, so that the light 18712 is diffused and then linearly polarized, and the diffused light 19010 is therefore polarized. The linear polarizer in the diffuser 18704 and the polarizing beam splitter film 19004 are oriented so that the linearly polarized light is reflected by the polarizing beam splitter film. In this way, when the reflective image source 18720 changes the polarization of the diffused light 19010, the reflected image light 19008 has the opposite polarization state compared with the diffused light 19010. The reflected image light 19008 then passes through the partially reflective film 19004 and continues on to the display optics. By using a flexible polarizing beam splitter film as the partially reflective surface 19004, the partially reflective surface 19004 can be curved and lightweight. The polarizing beam splitter film plays the dual role of reflector for the diffused light 19010 that illuminates the reflective image source 18720 and transparent member for the reflected image light 19008. As is known to those skilled in the art, an advantage of polarizing beam splitter films is that they can accept light over a wide range of incidence angles, so the curve does not interfere with light reaching the film. Further, because the polarizing beam splitter film is thin (for example, less than 200 microns), the curved shape does not significantly distort the image light 19008 as it passes through the film on its way to the display optics. Finally, polarizing beam splitter films have a very low tendency to scatter light, so high image contrast is maintained.
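The polarization bookkeeping in this arrangement — the film reflects the illumination state, the reflective image source flips the polarization of bright pixels, and the film then transmits the flipped state — can be sketched with idealized Jones vectors. This is my own simplified model (the half-wave treatment of a bright LCOS pixel and the perfect polarizer/splitter are idealizations, not claims from the patent):

```python
# Idealized Jones-vector sketch of the polarized frontlight path:
# polarizer -> PBS film reflects S -> bright LCOS pixel flips S to P ->
# PBS film transmits P toward the viewer. Assumptions: lossless, ideal
# components; a bright pixel modeled as a half-wave retarder on reflection.
import numpy as np

S = np.array([1.0, 0.0])  # polarization state reflected by the PBS film
P = np.array([0.0, 1.0])  # orthogonal state, transmitted by the PBS film

# Ideal bright pixel: swaps S and P on reflection (half-wave behavior).
half_wave = np.array([[0.0, 1.0],
                      [1.0, 0.0]])

illumination = S                        # diffuser's polarizer passes only S
image_light = half_wave @ illumination  # reflected off a bright pixel -> P
transmitted = image_light @ P           # P amplitude the PBS passes onward
print(float(transmitted))  # -> 1.0  (all bright-pixel light exits)

# A dark pixel leaves the polarization unchanged, so nothing leaks through:
print(float(S @ P))        # -> 0.0  (reflected back, preserving contrast)
```

The zero leakage for a dark pixel in this ideal model is the "stray light is reduced" argument the text makes for polarized diffused light.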
The flexible nature of polarizing beam splitter films allows them to be formed into curved shapes that redirect the light from the diffuser and focus it onto the reflective image source. The shape of the curve of the polarizing beam splitter film can be chosen based on the light distribution provided by the diffuser, so as to illuminate the reflective image source uniformly. Figure 190 shows the curved partially reflective film 19004 with a parabolic shape, but depending on the nature of the light source 18702 and the effectiveness of the diffuser 18704, a radiused curve, a complex spline curve, a relatively flat curve, a plane, or segmented planes may also be used to uniformly redirect the diffused light 19010 and focus it onto the reflective image source 18720. Experiments show that a curved partially reflective surface 19004 tends to concentrate the diffused light 19010 toward the center of the reflective image source 18720, so a curved surface is used to best advantage when the diffuser 18704 provides a light distribution that is brighter at the edges. Conversely, experiments show that a relatively flat partially reflective surface 19004 is used to best advantage when the diffuser 18704 provides a light distribution that is brighter at the center. When the partially reflective surface 19004 is composed of a flexible film, its shape can be maintained by side frames with suitably profiled grooves that hold the flexible film in position, as with the freestanding film shown in Figure 190. Two side frames are used, together with other components, to support the curved shape on either side of the display assembly 19002; because a significant portion of the display assembly 19002 is composed of air and the partially reflective surface 19004 is a film, the assembly is much lighter in weight than the prior-art display assembly 18700 shown in Figure 187. In addition, as can be seen in Figure 190, the width of the illuminated reflective image source 18720 is greater than the height of the diffuser 18704, so that the display assembly 19002 is more compact than the prior-art display assembly shown in Figure 188.
Figure 191 shows another embodiment of the invention, in which dual light sources 19104 are used in a display assembly 19102 and two relatively flat partially reflective surfaces are placed back to back. The two-sided frontlight shown in Figure 191 is provided with a solid film holder 19120, so that the display assembly 19102 resembles two of the display assemblies 18700 shown in Figure 187 arranged back to back. In Figure 191, rays are shown for one side only, but the components and rays of the other side are symmetric with the side shown. Within the solid film holder 19120 is a partially reflective film 19110 that extends continuously between the two sides. The solid film holder 19120 is also continuous between the two sides, so that the image light 19112 is not interrupted or deflected by a joint line between the two sides of the display assembly 19102. Together, the solid film holder 19120 and the partially reflective film 19110 provide a constant optical thickness, so the image light is not deflected or distorted. Image light 19112 of continuous image quality can therefore be provided while the illumination comes from the two light sources 19104. Each light source 19104 provides light 19114 to a diffuser 19108, which deflects and disperses the light 19114 to provide diffused light 19118 that illuminates half of the reflective image source 18720. The partially reflective film 19110 is held in the required shape by the solid film holder 19120. Most importantly, relative to the illuminated width of the reflective image source 18720, the height of the diffusers 19108 is reduced to half that of the prior-art diffuser 18704 shown for the display assembly 18700 in Figure 187.
Figure 192 shows a schematic illustration of a display assembly 19202 with dual light sources 19104 and a freestanding, unsupported partially reflective film 19204 that is supported only at its edges. In Figure 192, rays are shown for one side only, but the components and rays of the other side are symmetric with the side shown. The various components of the display assembly 19202 function identically to those of the assembly shown in Figure 191, with the benefit that the display assembly 19202 is lighter in weight than the display assembly 19102, because the majority of the display assembly 19202 is composed of air.
Figure 193 shows a display assembly 19302 with dual light sources 19104 and a freestanding, unsupported partially reflective film 19308 that is supported only at its edges, here providing two curved surfaces. In Figure 193, rays are shown for one side only, but the components and rays of the other side are symmetric with the side shown. The partially reflective film 19308 is continuous across the two sides, with similar curvature on each side. The curvature is chosen to reflect the diffused light 19312 provided by the diffusers and focus it onto the reflective image source 18720. The reflective image source 18720 reflects the diffused light 19312, thereby forming image light 19310. The height of the diffusers 19304 is less than half that of the prior-art diffuser 18704 shown in Figure 187, so that the frontlight and the display assembly 19302 are very compact.
Figure 194 shows a schematic illustration of a display assembly 19402 with a continuous partially reflective film 19308 held in a solid film holder 19404; in other respects the display assembly 19402 is similar to the display assembly 19302 shown in Figure 193. In Figure 194, rays are shown for one side only, but the components and rays of the other side are symmetric with the side shown. Solid film holders 19404 are used on either side of the partially reflective film 19308 to hold the film in the specified two-sided curve and to protect the partially reflective film 19308. The two sides of the solid film holder 19404 are joined in the middle by a relatively thin portion at the bottom of the solid film holder 19404, to further avoid a joint line that would disrupt the image light 19310 at the center of the image.
In preferred embodiments of the invention, the partially reflective film in the display assemblies shown in Figures 191-194 is a polarizing beam splitter film. In these embodiments, the diffusers include linear polarizers, so that the diffused light is polarized. The linear polarizers are aligned with the polarizing beam splitter film so that the diffused light has the polarization state that is reflected by the polarizing beam splitter film. The polarizing beam splitter film also acts as an analyzer for the image light. The advantage of using a polarizing beam splitter film with polarized diffused light in the frontlight is that stray light in the display assembly is reduced, because all of the polarized diffused light is reflected by the polarizing beam splitter film toward the reflective image source, where it is converted into image light. If the diffused light were unpolarized, the portion of the diffused light that is not reflected would be transmitted by the polarizing beam splitter film, and if this light were not controlled, it would add scattered light to the image light, reducing the contrast of the image presented to the viewer.
Figure 195 shows a schematic illustration of a display assembly 19502 with a single light source 19104 on one side and polarization control that efficiently illuminates the reflective image source 18720 from both sides. In this case, the light source 19104 provides unpolarized light 19114, and the diffused light 19508 is also unpolarized. The partially reflective film is a polarizing beam splitter film 19504 held in a solid film holder 19514. The polarizing beam splitter film 19504 reflects one polarization state of the diffused light (shown as ray 19510) while transmitting the other polarization state (shown as ray 19518). The polarizing beam splitter film 19504 is folded and continuous, so that the light of the other polarization state 19518 passes through both sides of the folded polarizing beam splitter film 19504. This light 19518 then passes through a quarter-wave retarder film 19524, which changes the polarization state from linear to circular. The circularly polarized light is then reflected by a mirror 19528 and passes back through the quarter-wave retarder film 19524, which changes the polarization state from circular back to linear, but now in the first polarization state (shown as ray 19520), so that the light 19520 is then reflected by the polarizing beam splitter film 19504 toward the reflective image source 18720. As a result, light of the same polarization state, all provided by the one light source 19104 of the display assembly 19502, illuminates the reflective image source 18720 on both sides. Because the diffused light 19508 is unpolarized and both polarization states (19510, 19518) are used to illuminate the reflective image source 18720, substantially all of the light provided by the light source is converted into image light (19512, 19522). The image light (19512, 19522) is delivered directly to the associated imaging optics. Once again, the height of the diffuser 19108 is half that of the diffuser 18704 shown in Figure 187, providing a compact and efficient frontlight and display assembly.
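The quarter-wave/mirror leg described for Figure 195 can be checked with an idealized Jones-matrix calculation: a double pass through a quarter-wave retarder with its fast axis at 45 degrees acts as a half-wave retarder, rotating the transmitted linear polarization into the state the splitter film reflects. This sketch is my own idealization (lossless components, one common sign convention for the mirror), not text from the patent:

```python
# Idealized Jones-matrix check of the recycling leg: PBS-transmitted light
# -> quarter-wave retarder (fast axis at 45 deg) -> mirror -> quarter-wave
# retarder -> returns in the orthogonal (PBS-reflected) polarization.
import numpy as np

def qwp_at_45deg() -> np.ndarray:
    """Quarter-wave retarder with its fast axis at 45 degrees."""
    c = 1.0 / np.sqrt(2.0)
    R = np.array([[c, -c], [c, c]])   # rotate into the fast-axis frame
    retard = np.diag([1.0, 1j])       # 90-degree phase retardation
    return R @ retard @ R.T

P = np.array([1.0, 0.0])              # state transmitted by the PBS film
mirror = -np.eye(2)                   # ideal mirror (overall pi phase;
                                      # sign is convention-dependent)

out = qwp_at_45deg() @ mirror @ qwp_at_45deg() @ P  # full round trip
intensity_P = abs(out[0]) ** 2        # remaining original polarization
intensity_S = abs(out[1]) ** 2        # converted orthogonal polarization
print(round(intensity_P, 6), round(intensity_S, 6))  # -> 0.0 1.0
```

In this ideal model the round trip converts all of the transmitted light into the reflectable state, which is why the single source can illuminate both sides of the image source.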
Figure 196 shows a display assembly 19602 with a geometric arrangement similar to that shown in Figure 195, but in which the polarizing beam splitter film 19604 is freestanding and unsupported, held only at its edges, reducing the weight of the frontlight while still providing a diffuser height that is low relative to the width of the illuminated reflective image source.
Figure 197 shows a further embodiment of the invention, including a display assembly 19702 with dual light sources 19704 and 19708 and a folded polarizing beam splitter film 19714, in which both sides of the folded polarizing beam splitter film 19714 are curved. The light 19718, 19720 from the light sources 19704, 19708 is unpolarized, and the diffusers 19710, 19712 do not include polarizers, so the diffused light 19722, 19724 is also unpolarized. The curved and angled sides of the polarizing beam splitter film 19714 redirect one polarization state of the diffused light (shown as rays 19728, 19730) toward the reflective image source 18720 while also converging the rays 19728, 19730 onto the imaging area of the reflective image source 18720. In this display assembly, the dual light sources 19704, 19708 and the folded polarizing beam splitter 19714 work in a complementary fashion, because the polarizing beam splitter film 19714 is continuous. Unpolarized diffused light 19722, 19724 is thus provided on each side of the display assembly 19702; a first polarization state (typically S) is redirected by the polarizing beam splitter film 19714 toward the reflective image source 18720, while light of the other polarization state (typically P, rays 19740, 19738) is transmitted by the polarizing beam splitter film 19714. The transmitted light 19740, 19738 of the other polarization state passes through both sides of the folded polarizing beam splitter film 19714, arriving at the diffusers 19712, 19710 on the respective opposite sides.
When the rays 19740, 19738 strike the diffusers 19712, 19710 on the opposite sides, the light is reflected by the diffusers and, in the process, becomes unpolarized. Reflectors can be added to the light sources 19704, 19708 and the surrounding areas to increase the reflection of the light 19740, 19738. This diffusely reflected, unpolarized light then mixes on each side with the diffused light 19722, 19724 provided by the light sources 19704, 19708 and passes back toward the polarizing beam splitter film 19714, where light 19730, 19728 of the first polarization state is reflected toward the reflective image source and light 19738, 19740 of the other polarization state is transmitted, and the process repeats continuously. Thus, in this embodiment of the invention, the light of the other polarization state is continuously recycled, which increases the efficiency of the display assembly 19702 because both polarization states of the light 19718, 19720 provided by the dual light sources 19704, 19708 are used to illuminate the reflective image source 18720. The added diffuse reflection of the recycled light also improves the uniformity of the illuminating light provided to the reflective image source 18720. The image light (19732, 19734) can be delivered directly to the associated imaging optics.
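The efficiency gain from continuously recycling the "wrong" polarization can be illustrated with a simple geometric-series model. This sketch is my own (the 50/50 polarization split, the single `return_efficiency` loss parameter, and the example values are assumptions for illustration, not figures from the patent):

```python
# Illustrative model of polarization recycling: on each pass, the PBS film
# redirects the S half of unpolarized light to the image source; the P half
# is transmitted, depolarized at the opposite diffuser, and returned with
# some loss, then split again. Total usable light is a geometric series.

def recycled_fraction(return_efficiency: float, cycles: int = 50) -> float:
    """Fraction of unpolarized source light that ends up illuminating the
    image source, given the per-cycle efficiency of the recycling path."""
    used, remaining = 0.0, 1.0
    for _ in range(cycles):
        used += 0.5 * remaining                # S half redirected and used
        remaining *= 0.5 * return_efficiency   # P half recycled with losses
    return used

print(round(recycled_fraction(0.0), 3))  # -> 0.5   (no recycling at all)
print(round(recycled_fraction(0.9), 3))  # -> 0.909 (90% return efficiency)
```

Even a lossy recycling path nearly doubles the usable illumination relative to simply discarding the transmitted polarization, which matches the text's efficiency argument.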
A method similar to the process described above for Figure 197 can be used in another embodiment in which the display assembly has flat surfaces on each side of the folded polarizing beam splitter film. In this embodiment, because each side of the reflective polarizer film is flat, the light from the side lamps retains the illumination uniformity provided by the diffusers.
In another version of the display assembly shown in Figure 197, a solid film holder can be used, with the light of the other polarization state again recycled to improve efficiency. In this embodiment, each side of the folded polarizing beam splitter film can be flat or curved.
Figure 198 shows a schematic illustration of a method for manufacturing a frontlight 19902 such as that shown in Figure 199, the frontlight having a folded polarizing beam splitter film 19808 and dual light sources, one on each side. The dual light sources are not shown in Figure 198 because they can be part of another assembly step or located in a surrounding module. A flowchart of the assembly method is provided in Figure 204. In this method, in step 20402, top 19810 and bottom 19812 film holders are provided. The top and bottom film holders 19810, 19812 can be made of any transparent material by diamond turning, injection molding, compression molding, or grinding. The combination of material and manufacturing technique is chosen to provide top 19810 and bottom 19812 film holders with low birefringence. Suitable low-birefringence materials for the film holders 19810, 19812 include glass and plastics such as Zeonex F52R from Zeon Chemicals, APL5514 from Mitsui, or OKP4 from Osaka Gas. The surfaces of the top and bottom film holders that will contact the folded polarizing beam splitter film 19808 are matched, so that the film 19808 is held in place in the required shape and at the required angles without introducing significant air gaps, and the image light is therefore essentially undeflected by the frontlight 19902. In step 20404, the bottom film holder 19812 is attached to the reflective image source 18720, either by adhesive bonding or by a surrounding structure that holds the bottom film holder 19812 in relationship to the reflective image source (either in contact or at a specified distance). In step 20408, the polarizing beam splitter film is folded. Then, in step 20410, the folded polarizing beam splitter film 19808 is placed into the lower film holder 19812 and the upper film holder 19810 is placed on top, thereby forcing the polarizing beam splitter film 19808 to conform to the matched surfaces of the top 19810 and bottom 19812 film holders. In an alternative embodiment of the method of the invention, adhesive is applied to the surfaces of the top 19810 or bottom 19812 film holder, so that the polarizing beam splitter film 19808 is bonded to the top 19810 or bottom 19812 film holder. In step 20412, diffusers 19802, 19804 are attached to each side of the lower film holder 19812. The assembled frontlight 19902 is shown schematically in Figure 199. Similar methods can be used to manufacture the frontlights shown in Figures 191, 194, and 195. The order of assembly can be changed within the scope of the invention.
In an alternative embodiment of the above method, the film retainers 19810, 19812 and the folded polarizing beam splitter film 19808 are assembled together before the film retainers 19810, 19812 are attached to the diffusers 19802, 19804, the reflective image source 18720, or any other parts. Steps 20402, 20408, and 20410 are then carried out in order to make a solid film retainer with the folded polarizing beam splitter film 19808 inside, similar to those shown in Figures 191, 194, and 195. The reflective image source 18720 and the diffusers 19802, 19804 are attached afterwards (steps 20404, 20412).
Various methods can be used to hold the reflective beam splitter film in place between the top and bottom film retainers. The film can be adhered in place to the top or bottom film retainer. The top or bottom film retainer can be adhered to a surrounding structural member (not shown) or to the associated imaging optics (not shown). When the reflective beam splitter film is a polarizing beam splitter film with a wire-grid polarizer, the performance of the wire-grid polarizer can be degraded if adhesive is used on the side with the wire-grid structure. In this case, the polarizing beam splitter film can be adhered, on the side opposite the wire-grid structure, to the top or bottom film retainer (depending on which of the top or bottom film retainers adjoins the wire-grid structure). The adhesive used to adhere the polarizing beam splitter film to the film retainer must be transparent and of low birefringence. Examples of suitable adhesives include UV-curable adhesives and pressure-sensitive adhesives.
Figures 200-203 show a series of schematic illustrations of another method for making a frontlight with dual side lights. Figure 205 is a flow chart listing the steps of this method. In this method, the top and bottom film retainers can be cast in place around the folded reflective beam splitter film. In step 20502, the polarizing beam splitter film 20008 is folded. In step 20504, the folded polarizing beam splitter film 20008 is inserted into side frames that have grooves or matching features to hold the polarizing beam splitter film 20008 in the required shape for the frontlight (see the dual curved shape shown in Figure 200). In step 20508, the side frames are then attached to the reflective image source 18720. In step 20510, diffusers 20002, 20004 are attached to each side of the side frames. At this point, the folded polarizing beam splitter film 20008 is surrounded on the sides by the side frames and the diffusers 20002, 20004, and on the bottom by the reflective image source 18720. Figure 200 shows a schematic illustration of the reflective image source 18720 with the attached diffusers 20002, 20004 and the free-standing, unsupported reflective beam splitter film 20008, which is supported at the edges so that the required shape is imparted to the reflective beam splitter film 20008.
Figure 201 shows holes in the side frames or surrounding structure that are used to introduce transparent casting material under the folded reflective beam splitter film. As shown, the larger holes 20102 near the reflective image source 18720 are used to introduce the transparent casting material, and the smaller holes 20104 are used to allow air to escape from under the folded reflective beam splitter film 20008. In this method, the folded reflective beam splitter film 20008 forms a closed cavity over the reflective image source 18720, the cavity being contained by the diffusers 20002, 20004 and the side frames or surrounding structure. As the transparent casting resin is slowly injected into the holes 20102, air from the closed cavity is expelled through the smaller holes 20104. When the cavity is full, portions of the transparent casting material exit the holes 20104, thereby preventing pressure from building up under the reflective beam splitter film 20008, which could distort the shape of the film. The holes 20102 and 20104 can then be plugged to prevent the transparent casting material from leaking out.
In step 20512, transparent liquid casting material 20202 is poured over the top of the polarizing beam splitter film 20008, as shown in Figure 202. In step 20514, a transparent top piece or plate 20302 is then applied to provide a flat top to the material 20202, as shown in Figure 203. When applying the flat plate of transparent material to the transparent casting material, care must be taken to prevent air from being trapped under the flat plate of transparent material. Stops can be provided in the surrounding structure so that the flat plate of transparent material is held parallel to the reflective image source.
The transparent liquid casting material can be any transparent liquid casting material, such as an epoxy, acrylic, or urethane. The same transparent liquid casting material should be used for the top film retainer as for the bottom film retainer, so that the image light is exposed to a solid block of uniform optical thickness and the image light is not deflected by the surfaces of the folded polarizing beam splitter film. The transparent liquid casting material can be cured after casting by allowing cure time, by exposure to UV light, or by exposure to heat. The curing of the transparent casting material can be done in a single step or in multiple steps. The curing of the lower portion as shown in Figure 201 can be done before the casting of the top portion shown in Figure 202. Alternatively, the curing of the entire cast frontlight can be done after the step shown in Figure 203.
An advantage of the method shown in Figures 200-203 is that intimate contact is obtained between the transparent casting material and the reflective beam splitter film, so that light can pass unimpeded through the various portions of the frontlight. The casting method can also be used for either a solid top or a solid bottom film retainer, such that only the top or only the bottom film retainer is cast. While Figures 200-203 show the manufacture of a frontlight with curved surfaces, the method can also be used to make frontlights with flat surfaces.
In another embodiment, one of the film retainers is made as a solid member and the other film retainer is cast in place with the folded polarizing beam splitter film. The folded polarizing beam splitter film can be bonded to the solid member before the other film retainer is cast in place. In this way, the cast film retainer will be in intimate contact with the surfaces of the polarizing beam splitter film. The material for the solid film retainer should have the same refractive index as the cast film retainer, to avoid deflecting the image light as it passes from the reflective image source to the associated imaging optics. An example of suitably matched materials is APEC2000 from Bayer, which has a refractive index of 1.56 and can be injection molded, and EpoxAcast690 from Smooth-On, which has a refractive index of 1.565 and can be cast.
In a further embodiment of the invention, a solid film retainer is made using a multistep molding process as shown in the flow chart of Figure 206. In step 20602, the bottom film retainer is molded. Suitable molding techniques include injection molding, compression molding, or casting. In step 20604, the polarizing beam splitter film is folded. In step 20608, the folded polarizing beam splitter film is placed onto the molded bottom film retainer, which is then placed as an insert into the mold for the top film retainer. In step 20610, the top film retainer is then molded over the folded polarizing beam splitter film and the bottom film retainer. The end result is a solid film retainer with the folded polarizing beam splitter film inside, as shown in Figures 191, 194, and 195. An advantage of the multistep molding technique is that the folded polarizing beam splitter film is forced to conform to the surface of the bottom film retainer, and the top and bottom film retainers are in intimate contact with the folded polarizing beam splitter film. In a preferred embodiment, the refractive indices of the top and bottom film retainers are the same, to within 0.03. In a further preferred embodiment, the glass transition temperature of the material used for the bottom film retainer is higher than the glass transition temperature of the material used for the top film retainer, or the material used for the bottom film retainer is crosslinked, so that the bottom film retainer does not deform when the top film retainer is molded over the folded polarizing beam splitter film and the bottom film retainer. An example of a suitable combination of moldable materials is cyclic olefin materials such as Zeonex E48R from Zeon Chemicals, with a Tg of 139C and a refractive index of 1.53, and Topas 6017 from Topas Advanced Polymers, with a Tg of 177C and a refractive index of 1.53.
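The two material-selection criteria for the multistep molding process (refractive indices matched to within 0.03, and a higher glass transition temperature for the first-molded bottom retainer so it survives the second molding shot) can be expressed as a small check. This is an illustrative sketch, not part of the specification; only the quoted material values are from the text.

```python
def retainer_pair_ok(n_top, n_bottom, tg_top_c, tg_bottom_c, max_dn=0.03):
    """Check the two criteria described above:
    1) refractive indices of the top and bottom retainers match within max_dn,
       so image light crosses the retainer interface without deflection;
    2) the bottom (first-molded) retainer has the higher glass transition
       temperature, so it does not deform when the top is molded over it.
    """
    index_match = abs(n_top - n_bottom) <= max_dn
    tg_order = tg_bottom_c > tg_top_c
    return index_match and tg_order

# Materials quoted in the text: Zeonex E48R (n=1.53, Tg=139C) as the top,
# Topas 6017 (n=1.53, Tg=177C) as the bottom.
print(retainer_pair_ok(1.53, 1.53, 139, 177))  # True
```

The ordering of the Tg arguments reflects the quoted example pairing: the higher-Tg Topas 6017 serves as the bottom retainer.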
It can be appreciated that some embodiments of the AR eyepiece of the invention enable previously unachievable resolution levels and high modulation transfer functions in various combinations with device sizes such as frame thickness. For example, in some embodiments, the virtual image pixel resolution level presented to the user can be in the range of about 28 to 46 pixels per degree.
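As a rough illustration of the quoted 28-46 pixels-per-degree range, angular resolution can be estimated as display pixels divided by the field of view they span. The 1280-pixel width below comes from the 1280x720p LCoS panel mentioned elsewhere in this text; the field-of-view values are assumptions for the arithmetic only.

```python
def pixels_per_degree(pixels, fov_degrees):
    """Angular resolution of a virtual image: pixels spanned per degree of field of view."""
    return pixels / fov_degrees

# Assuming a 1280-pixel-wide panel, the quoted ~28-46 ppd range corresponds
# to horizontal fields of view of roughly 46 down to 28 degrees:
for fov in (28.0, 35.0, 46.0):
    print(f"{fov:.0f} deg FOV -> {pixels_per_degree(1280, fov):.1f} ppd")
```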
With reference to Figures 105A to C, the angle of the curved wire-grid polarizer controls the direction of the image light. The curve of the curved wire-grid polarizer controls the width of the image light. The curve enables the use of a narrow light source because it spreads the light: light striking the curve is folded/reflected to uniformly illuminate the image display. Image light passing back through the wire-grid polarizer is undisturbed. The curve therefore also enables miniaturization of the optics assembly.
In Figures 21-22, an augmented reality eyepiece 2100 includes a frame 2102 and left and right earpieces or temple portions 2104. Protective lenses 2106, such as ballistic lenses, are mounted on the front of the frame 2102 to protect the user's eyes or to correct the user's view of the surrounding environment if they are prescription lenses. The front of the frame can also be used to mount a camera or image sensor 2130 and one or more microphones 2132. Not visible in Figure 21, waveguides are mounted in the frame 2102 behind the protective lenses 2106, one on each side of the center or adjustable nose bridge 2138. The front cover 2106 may be interchangeable, so that tints or prescriptions can easily be changed for the particular user of the augmented reality device. In one embodiment, each lens is quickly interchangeable, allowing a different prescription for each eye. In one embodiment, the lenses are quickly interchangeable with snaps, as discussed elsewhere herein. Certain embodiments may have a projector and waveguide combination on only one side of the eyepiece, with the other side filled with a regular lens, a reading lens, a prescription lens, or the like. The left and right earpieces 2104 each vertically mount a projector or micro-projector 2114 or other image source atop a spring-loaded hinge 2128 for easier assembly and vibration/shock protection. Each temple portion also includes a temple housing 2116 for mounting the associated electronics for the eyepiece, and each may also include an elastomeric head grip pad 2120, for better retention on the user. Each temple portion also includes extending wrap-around earbuds 2112 and an orifice 2126 for mounting a headstrap 2142.
As noted, the temple housing 2116 contains electronics associated with the augmented reality eyepiece. The electronics may include several circuit boards, as shown, such as a circuit board 2122 for a microprocessor and radios, a circuit board 2124 for a communications system on a chip (SOC), and an open multimedia applications processor (OMAP) processor board 2140. The communications system on a chip (SOC) may include electronics for one or more communications capabilities, including a wide local area network (WLAN), BlueToothTM communications, frequency modulation (FM) radio, a global positioning system (GPS), a 3-axis accelerometer, one or more gyroscopes, and the like. In addition, the right temple piece may include an optical trackpad (not shown) on its outside, for user control of the eyepiece and one or more applications.
In one embodiment, a digital signal processor (DSP) may be programmed and/or configured to receive video feed information and configure the video feed to drive whatever type of image source is being used with the optical display. The DSP may include a bus or other communication mechanism for communicating information, and an internal processor coupled with the bus for processing the information. The DSP may include a memory, such as a random access memory (RAM) or other dynamic storage device (e.g., dynamic RAM (DRAM), static RAM (SRAM), and synchronous DRAM (SDRAM)), coupled to the bus for storing information and the instructions to be executed. The DSP may include a non-volatile memory, such as a read-only memory (ROM) or other static storage device (e.g., programmable ROM (PROM), erasable PROM (EPROM), and electrically erasable PROM (EEPROM)), coupled to the bus for storing static information and instructions for the internal processor. The DSP may include special purpose logic devices (e.g., application specific integrated circuits (ASICs)) or configurable logic devices (e.g., simple programmable logic devices (SPLDs), complex programmable logic devices (CPLDs), and field programmable gate arrays (FPGAs)).
The DSP may include at least one computer readable medium or memory for holding the programmed instructions and for containing the data structures, tables, records, or other data necessary to drive the optical display. Examples of computer readable media suitable for the invention may be compact discs, hard disks, floppy disks, tape, magneto-optical disks, PROMs (EPROM, EEPROM, flash EPROM), DRAM, SRAM, SDRAM, or any other magnetic medium, compact discs (e.g., CD-ROM), or any other optical medium, punch cards, paper tape, or any other physical medium with patterns of holes, a carrier wave (described below), or any other medium from which a computer can read. The DSP may also include a communication interface to provide a data communication coupling to a network link that can be connected to, for example, a local area network (LAN), or to another communications network such as the Internet. Wireless links may also be implemented. In any such implementation, an appropriate communication interface can send and receive electrical, electromagnetic, or optical signals that carry digital data streams representing various types of information (such as the video information) to the optical display.
The eyepiece may perform context-aware capture of video, the capture adjusting video capture parameters based on the motion of the viewer, where the parameters may be image resolution, video compression ratio, frames per second, and the like. The eyepiece may be used for various video applications, such as recording video taken by the integrated camera or sent from an external video device, playing back video to the wearer through the eyepiece (through the methods and systems described herein), streaming video from an external source (e.g., a conference call, a live news feed, a video stream from another eyepiece), live video from the integrated camera (e.g., from an integrated non-line-of-sight camera), and the like. In embodiments, the eyepiece may accommodate multiple video applications presented to the wearer at once, for example viewing a streamed external video link while playing back a video file stored on the eyepiece. The eyepiece may provide a 3D viewing experience, such as by providing images to either eye, or alternately may provide a simplified 3D experience, such as by providing content of reduced quantity to one of the two eyes. The eyepiece may provide text-enhanced video, such as when audio conditions are too noisy to hear the included audio, when the audio is in a language foreign to the user, when the user wants a transcript of the recorded audio, and the like.
In embodiments, the eyepiece may provide a context-aware video application, such as adjusting at least one parameter of video capture and/or viewing as a function of the environment of the wearer. For example, in the context of an external environment where the wearer's attention should be focused on the environment and not on the video, video may be presented to the wearer of the eyepiece with at least one parameter adjusted so that the presented video is less distracting (e.g., an adjustment of spatial resolution; an adjustment of the number of frames per second; replacing the video with a still image representative of the video content, such as a stored photo of a person or a single frame from the video), and the like. In another context, video may be captured by the camera integrated on the eyepiece while the wearer is in motion (e.g., walking, running, biking, driving), where at least one parameter adjusts the video being captured to help accommodate the motion (e.g., adjustments made during quick motions where the eyepiece senses the video will be blurred, adjustments made while the wearer is walking or moving slowly).
In embodiments, the parameter may be a spatial resolution parameter (e.g., pixels per area, pixels of a specified color per area, limited to only single ('black and white') pixels per area), the field of view, frames recorded per time, frames presented per time, the data compression rate, periods of time not recorded/presented, and the like.
In embodiments, the parameter may be adjusted based on inputs sensed by the eyepiece, such as from motion-detection inputs used to determine head motion (e.g., to determine quick head motions, slow head motions) (as described herein), from video capture of the surrounding environment used to determine relative motion between the wearer and the environment by processing images received through the integrated camera, from motion of the wearer or motion in the environment, from eye movements of the wearer (as described herein) to determine whether the wearer is being distracted by the video being presented to the wearer, from ambient light and/or sound conditions, and the like.
In embodiments, the eyepiece may provide image processing to reduce the effects of motion or environment on the quality of the video experience of the wearer, or on the quality of the video being stored as the video is captured, such as compensating for slight motions, jumps, and rapid motions; adjusting for background lighting and/or sound environments; adjusting color mixing, brightness, and the like. The choice of processing may be a function of the sensed inputs, the environmental conditions, the video content, and the like. For example, in some instances high-quality images are preferred, such that a reduction in quality is unacceptable, and the video may therefore be paused under those conditions. In another instance, when conditions are determined to interfere with capture at an acceptable quality level but some continuity of capture is still required, video and/or audio compression may be applied. Processing may also be applied differently to each eye of the eyepiece, such as with respect to the dominant eye of the wearer, with respect to different environmental conditions experienced by one eye versus the other, and the like. Processing may compensate for a bright-light environment, where embedded sensors are polled for ambient light levels to make possible adjustments to the displayed content, such as determining what color channel compression and/or manipulation to perform based on the environment, modifying a color curve/palette to be more or less visible relative to the environment, changing the color depth, changing how colors are compressed in the color curve, and the like.
In embodiments, the eyepiece may initiate an action as a result of a sensed condition, such as going to a screenshot capture mode while continuing the audio portion of the video if, for example, a condition such as a predetermined amount of motion of the eyepiece is exceeded; stopping video capture if motion would degrade a predetermined quality level; triggering a change in the video presentation when a motion level is exceeded in the received video; and the like.
In embodiments, the eyepiece may initiate an action as a result of receiving a control signal. The control signal may be based on the location of the eyepiece, the content the eyepiece is currently viewing, or a user gesture. The action may be the uploading or downloading of video captured by the eyepiece to a storage location. The action may be initiated on receipt of the control signal alone, or on receipt of the control signal plus a user-initiated confirmation of the control signal. The action may be moving to a designated location in the video being displayed by the glasses, initiating a process to bookmark a designated location in the video being displayed by the glasses, and the like.
In embodiments, the adjustments made as a result of a sensed condition may be controlled by user preferences, organizational policies, state or federal regulations, and the like. For example, a preference may be to always provide a certain quality, resolution, compression ratio, and the like, no matter what the sensed inputs indicate.
In an example, the wearer of the eyepiece may be in an environment where their head, and therefore the integrated camera of the eyepiece, is shaking rapidly while the eyepiece is recording video. In this instance, the eyepiece may adjust at least one parameter to reduce the extent to which shaky video is captured, such as increasing the compression ratio applied to the video, decreasing the number of frames captured per period of time (e.g., capturing one frame every few seconds), discarding frames with large image changes between frames, decreasing the spatial resolution, and the like.
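The parameter adjustments in this example can be sketched as a simple policy that maps a sensed motion level to capture settings. The thresholds, parameter names, and values below are illustrative assumptions, not figures from the text.

```python
def capture_settings(motion_level):
    """Map a normalized sensed motion level (0.0 = still, 1.0 = violent shaking)
    to video-capture parameters, trading quality for robustness as motion grows.
    Thresholds and values are illustrative only."""
    if motion_level < 0.2:   # essentially still: full quality
        return {"fps": 30, "scale": 1.0, "compression": "low"}
    if motion_level < 0.6:   # walking / slow movement: mild reduction
        return {"fps": 15, "scale": 0.75, "compression": "medium"}
    if motion_level < 0.9:   # rapid shaking: sparse frames, heavy compression
        return {"fps": 1, "scale": 0.5, "compression": "high"}
    # beyond usable: fall back to one frame every 5 seconds, audio continues
    return {"fps": 1 / 5, "scale": 0.5, "compression": "high"}

print(capture_settings(0.1)["fps"])   # 30
print(capture_settings(0.7)["fps"])   # 1
```

A real implementation would derive `motion_level` from the eyepiece's accelerometer/gyroscope inputs or from frame-to-frame image differences, as the surrounding text describes.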
In an example, the wearer of the eyepiece may be using video conferencing through the eyepiece, where the eyepiece senses through motion sensors that the wearer is in motion. As a result, a still image may replace the video feed of a participant during the motion, such as an image of one of the other participants, or the image of the user as sent to the other members. In this way, the distracting effects caused by the wearer's motion may be reduced for the wearer and/or the other participants in the video conference.
In an example, the wearer may be watching a video and then begin driving, where continuing to watch the currently displayed video may become a safety concern. In this instance, the eyepiece may interpret the detected motion of the environment as indicating being in a car, and change the viewing experience to something less distracting, such as when the eye movements of the wearer indicate that the user is rapidly alternating their gaze between the view of the road ahead (the direction of travel) and the displayed video. The eyepiece may, for example, stop the video and provide the viewer with an option to resume. The eyepiece may also distinguish between sensed environmental motion indicating being in a car, on a bicycle, walking, and the like, and adjust accordingly.
In an example, the wearer may need assistance navigating to a location, whether in a car, on a bicycle, walking, or otherwise. In this instance, the eyepiece displays a video navigation application to the user. The navigation directions the eyepiece displays to the user may be selected by a control signal. The control signal may be generated by a location specified by the wearer, by the content currently being displayed by the glasses, or by a destination spoken by the wearer. The location may be one of food/drink, education, events, exercise, home, outdoors, retail, transportation, and the like.
In an example, the wearer may be capturing video where the surrounding environment is distracting or in some respect reduces the quality of the video, such as due to color contrast, mixing, depth, resolution, brightness, and the like. The eyepiece may make adjustments for the wearer, such as for being indoors versus outdoors, under different lighting conditions, in poor audio conditions, and the like. In this instance, the eyepiece may adjust the recorded images and sound to obtain a video product that more effectively represents the content being captured.
In embodiments, the eyepiece may provide an external interface to computer peripheral devices, such as monitors, displays, TVs, keyboards, mice, memory storage (e.g., external hard drives, optical drives, solid state memory), network interfaces (e.g., a network interface to the Internet), and the like. For example, the external interface may provide a direct connection to an external computer peripheral (e.g., connecting directly to a monitor), an indirect connection to an external computer peripheral (e.g., through a central external peripheral interface device), through a wired connection, through a wireless connection, and the like. In an example, the eyepiece may connect to a central external peripheral interface device that provides connections to external peripherals, where the external peripheral interface device may include computer interface facilities, such as a computer processor, memory, an operating system, peripheral drivers and interfaces, USB ports, external display interfaces, network ports, speaker interfaces, microphone interfaces, and the like. In embodiments, the eyepiece may connect to the central external peripheral interface by a wired connection, a wireless connection, by placement directly into a cradle, and the like, and when connected may provide the eyepiece with computing facilities similar or identical to those of a personal computer. In embodiments, a device to be controlled through the eyepiece may be selected by the user looking at the device, pointing the eyepiece at the device, selecting from a user interface displayed on the eyepiece, and the like. In other embodiments, the eyepiece may display a user interface for a device when the user looks at the device or senses the device.
The frame 2102 is in the general shape of a pair of wrap-around sunglasses. The sides of the glasses include shape-memory alloy straps 2134, such as nitinol straps. The nitinol or other shape-memory alloy straps fit the user of the augmented reality eyepiece. The straps are tailored so that they assume their trained or preferred shape when worn by the user and warmed to near body temperature. In embodiments, the fit of the eyepiece may provide user eye-width alignment techniques and measurements. For example, the position of the projected display may be adjusted into position and/or alignment relative to the wearer of the eyepiece, to accommodate the various eye widths of different wearers. The positioning and/or alignment may be automatic, such as through detection of the position of the wearer's eyes via the optical system (e.g., iris or pupil detection), or manual, such as by the wearer, and the like.
Other features of this embodiment include detachable noise-canceling earbuds. As seen in the figure, the earbuds are intended for connection to the controls of the augmented reality eyepiece for delivering sounds to the ears of the user. The sounds may include inputs from the wireless internet or telecommunications capability of the augmented reality eyepiece. The earbuds may also include soft, deformable plastic or foam portions, so that the inner ears of the user are protected in a manner similar to earplugs. In one embodiment, the earbuds limit inputs to the user's ears to about 85 dB. This enables normal hearing by the wearer, while providing protection from gunshot noise or other explosive noises and enabling hearing in high-background-noise environments. In one embodiment, the controls of the noise-canceling earbuds may have an automatic gain control for very fast adjustment of the canceling feature to protect the wearer's ears.
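The ~85 dB input limit described for the earbuds amounts to clamping the playback level to a protective ceiling. A minimal sketch of such a limiter follows; the function names and the decision to express gain relative to the ceiling are illustrative assumptions, not details from the text.

```python
def limit_spl(requested_db_spl, ceiling_db_spl=85.0):
    """Clamp a requested playback level to the protective ceiling (about 85 dB
    in the embodiment described above) and return the clamped level together
    with the linear amplitude ratio relative to the ceiling. Illustrative only."""
    actual = min(requested_db_spl, ceiling_db_spl)
    # 20*log10(ratio) dB <-> linear amplitude ratio
    gain = 10 ** ((actual - ceiling_db_spl) / 20.0)
    return actual, gain

print(limit_spl(100.0))  # a 100 dB request is capped at 85 dB (gain 1.0 at ceiling)
print(limit_spl(65.0))   # 20 dB below the cap corresponds to 0.1x amplitude
```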
Figure 23 depicts a layout of the vertically arranged projector 2114 in eyepiece 2300, where the illumination light passes from top to bottom along one side of the PBS on its way to the display, which may be a silicon-backplane display coupled to an imager board, is refracted as image light when it strikes the internal interface of the triangular prism that forms the polarizing beam splitter, and is reflected out of the projector and into the waveguide lens. In this example, the dimensions of the projector are shown with the imager board having a width of 11 mm, a distance of 10.6 mm from one end of the imager board to the image centerline, and a distance of about 11.8 mm from the image centerline to the end of the LED board.
A detailed and assembled view of the components of the projector discussed above may be seen in Figure 25. This view depicts how compact the micro-projector 2500 is when assembled near, for example, a hinge of the augmented reality eyepiece. The micro-projector 2500 includes a housing and a holder 2508 for mounting certain of the optical elements. As each color field is imaged by the optical display 2510, the corresponding LED color is turned on. The RGB LED light engine 2502 is depicted near the bottom, mounted on a heat sink 2504. The holder 2508 is mounted atop the LED light engine 2502, the holder mounting a light tunnel 2520, a diffuser lens 2512 (to eliminate hotspots), and a condenser lens 2514. Light passes from the condenser lens into the polarizing beam splitter 2518 and then to the field lens 2516. The light is then refracted onto the LCoS (liquid crystal on silicon) chip 2510, where an image is formed. The light for the image is then reflected back through the field lens 2516, and is polarized and reflected 90° by the polarizing beam splitter 2518. The light then leaves the micro-projector for transmission to the optical display of the glasses.
Figure 26 depicts an exemplary RGB LED module 2600. In this example, the LED is a 2x2 array with 1 red, 1 blue, and 2 green dies, and the LED array has 4 cathodes and a common anode. The maximum current may be 0.5 A per die, and a maximum voltage (≈4 V) may be needed for the green and blue dies.
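At the quoted per-die limits (0.5 A, up to ≈4 V for the green and blue dies), a worst-case electrical load for the 2x2 module can be bounded with simple arithmetic. The red-die forward voltage below is an assumed typical value, not a figure from the text.

```python
# Worst-case electrical power of the 2x2 RGB LED module described above.
I_MAX = 0.5            # A per die (quoted maximum)
V_GREEN_BLUE = 4.0     # V, quoted maximum for the green and blue dies
V_RED = 2.2            # V, assumed typical red forward voltage (not in the text)

dies = [V_RED, V_GREEN_BLUE, V_GREEN_BLUE, V_GREEN_BLUE]  # 1 red, 1 blue, 2 green
total_w = sum(v * I_MAX for v in dies)
print(f"worst-case module power: {total_w:.1f} W")  # 7.1 W
```

In practice field-sequential operation (one color on at a time, per Figure 25's discussion) keeps the average dissipation well below this simultaneous-drive bound.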
In embodiments, the display system uses an optical system that can produce a monochrome display to the wearer, where monochrome can provide benefits in image sharpness, image resolution, frame rate, and the like. For example, the frame rate may be tripled (compared with an RGB system), and this may be useful in night-vision or similar situations, where a camera images the surroundings and the images are processed and displayed as content. The images may also be brighter, such as three times as bright if three LEDs are used, or space may be saved by using only one LED. If multiple LEDs are used, they may be the same color or they may be different colors (RGB). The system may be a switchable monochrome/color system which uses RGB, but when the wearer wants monochrome, either an individual LED or multiple LEDs may be selected. All three LEDs may be used simultaneously, rather than sequentially, to obtain white light. Using the three LEDs non-sequentially may provide the same benefits as in any other case where the frame rate is tripled. The 'switching' between monochrome and color may be done 'manually' (e.g., via a physical button or a GUI interface selection), or the switching may occur automatically depending on the application being run. For example, the wearer may enter a night-vision mode or a fog-penetrating mode, and the processing portion of the system automatically determines that the eyepiece needs to enter a monochrome high-refresh-rate mode.
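The tripled monochrome frame rate follows from field-sequential color: an RGB system spends three sub-frames (one per color field) on each displayed frame, while a monochrome mode can show a new image on every sub-frame. A sketch of that arithmetic, with an assumed example field rate of 180 Hz:

```python
def effective_fps(field_rate_hz, fields_per_frame):
    """Frames per second delivered by a field-sequential display:
    the raw field rate divided by the number of color fields per frame."""
    return field_rate_hz / fields_per_frame

FIELD_RATE = 180.0  # Hz, assumed example field rate (not a figure from the text)
print(effective_fps(FIELD_RATE, 3))  # RGB: 60.0 fps (R, G, B fields per frame)
print(effective_fps(FIELD_RATE, 1))  # monochrome: 180.0 fps - three times higher
```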
Fig. 3 depicts an embodiment of a horizontally disposed projector in use. The projector 300 may be placed in an arm of the eyepiece frame. The LED module 302, under processor control 304, may emit a single color at a time in rapid sequence. The emitted light may travel down a light tunnel 308 and pass through at least one homogenizing lenslet 310 before encountering a polarizing beam splitter 312 and being deflected toward the LCoS display 314, where a full-color image is displayed. The LCoS display may have a resolution of 1280x720p. The image may then be reflected back up through the polarizing beam splitter, reflected off a fold mirror 318, and travel through a collimator on its way out of the projector and into a waveguide. The projector may include a diffractive element for eliminating aberrations.
In one embodiment, the interactive head-mounted eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content, where the optical assembly includes a corrective element that corrects the user's view of the surrounding environment, a freeform optical waveguide enabling internal reflection, and a coupling lens positioned to direct an image from an optical display, such as an LCoS display, to the optical waveguide. The eyepiece further includes an integrated processor for handling content for display to the user, and an integrated image source, such as a projector facility, for introducing the content to the optical assembly. In embodiments where the image source is a projector, the projector facility includes a light source and the optical display. Light from the light source, such as an RGB module, is emitted under control of the processor and traverses a polarizing beam splitter, where it is polarized before being reflected off the optical display — an LCoS display or, in certain other embodiments, an LCD display — and into the optical waveguide. A surface of the polarizing beam splitter may reflect the color image from the optical display into the optical waveguide. The RGB LED module may emit light sequentially to form the color image reflected off the optical display. The corrective element may be a see-through correction lens attached to the optical waveguide that enables proper viewing of the surrounding environment whether the image source is on or off. This corrective element may be a wedge-shaped correction lens, and may be prescription, tinted, coated, and the like. The freeform optical waveguide, which may be described by a higher-order polynomial, may include dual freeform surfaces that enable the curvature and sizing of the waveguide. The curvature and sizing of the waveguide enable its placement in a frame of the interactive head-mounted eyepiece. This frame may be sized to fit a user's head in a manner similar to sunglasses or eyeglasses. Other elements of the optical assembly of the eyepiece include a homogenizer through which light from the light source is propagated to ensure that the beam is uniform, and a collimator that improves the resolution of the light entering the optical waveguide.
In embodiments, prescription lenses may be mounted on the inside or outside of the eyepiece lenses. In some embodiments, the prescription power may be divided between prescription lenses mounted on the outside and the inside of an eyepiece lens. In embodiments, the prescription correction is provided by corrective optics, such as corrective optics adhered by surface tension to an eyepiece lens or to a component of the optical assembly, such as the beam splitter. In embodiments, the corrective optics may be disposed partly in one location of the optical path and partly in another. For example, half of the corrective optics may be disposed outside the convergence plane of the beam splitter and the other half inside the convergence plane. In this way, correction can be provided differently to image light from the internal source and to scene light. That is, light from the source may be corrected only by the portion of the corrective optics inside the convergence plane, because the image is reflected to the user's eye, while scene light is corrected by both portions, because it is transmitted through the beam splitter and is thus exposed to different optical corrections. In another embodiment, the optical assembly associated with the beam splitter may be a sealed assembly, making the assembly waterproof, dustproof, and the like, where the inner surface of the sealed optical assembly carries one portion of the corrective optics and the outer surface carries the other portion. Suitable optics may be provided by Press-On Optics from the 3M company, available at least as prisms (i.e., Fresnel prisms), aspheric minus lenses, aspheric plus lenses, and bifocal lenses. The corrective optics may be a user-removable and interchangeable vision-correction facility adapted to be removably attached in a suitable position between the user's eye and the displayed content, such that the vision-correction facility corrects the user's eyesight with respect to both the displayed content and the surrounding environment. The vision-correction facility may be adapted to mount to the optical assembly. The vision-correction facility may be adapted to mount to the head-mounted eyepiece. The vision-correction facility may be mounted with a friction fit. The vision-correction facility may be mounted with a magnetic attachment facility. Depending on the user's eyesight, the user may select from a plurality of different vision-correction facilities.
In embodiments, the present disclosure may provide 'snap-on' corrective optics for the eyepiece, such as where a user-removable and interchangeable vision-correction facility is adapted to be removably attached in a suitable position between the user's eye and the displayed content, such that the vision-correction facility corrects the user's eyesight with respect to the displayed content and the surrounding environment. The vision-correction facility may be adapted to mount to the optical assembly, to the head-mounted eyepiece, and the like. The vision-correction facility may be mounted with a friction fit, a magnetic attachment facility, and the like. Depending on the user's eyesight, the user may select from a plurality of different vision-correction facilities.
With reference to Fig. 4, the image light, which may be polarized and collimated, may optionally pass through a display coupling lens 412 — which may or may not itself be the collimator, or may be in addition to the collimator — and enter the waveguide 414. In embodiments, the waveguide 414 may be a freeform waveguide, where the surfaces of the waveguide are described by a polynomial equation. The waveguide may instead be straight. The waveguide 414 may include two reflective surfaces. When image light enters the waveguide 414, it may strike a first surface at an angle of incidence greater than the critical angle above which total internal reflection (TIR) occurs. The image light may engage in TIR bounces between the first surface and the opposing second surface, eventually reaching the active viewing area 418 of the composite lens. In one embodiment, the light may engage in at least three TIR bounces. Because the waveguide 414 is tapered to allow the TIR bounces to eventually exit the waveguide, the thickness of the composite lens 420 may not be uniform. Distortion across the viewing area of the composite lens 420 may be minimized by disposing a wedge-shaped correction lens 410 along the length of the freeform waveguide 414 in order to provide a uniform thickness across at least the viewing area of the lens 420. The correction lens 410 may be a prescription lens, a tinted lens, a polarized lens, a ballistic lens, and the like, mounted on the inside or outside of the eyepiece lens, or in some embodiments mounted on both the inside and the outside of the eyepiece lens.
In some embodiments, although the optical waveguide may have a first surface and a second surface enabling total internal reflection of light entering the waveguide, the light may not actually enter the waveguide at an internal angle of incidence that would result in total internal reflection. The eyepiece may include a mirrored surface on the first surface of the optical waveguide to reflect the displayed content toward the second surface of the optical waveguide. Thus, the mirrored surface enables total reflection of the light entering the optical waveguide, or reflection of at least a portion of that light. In embodiments, the surface may be 100% mirrored, or mirrored to a lower percentage. In some embodiments, in place of a mirrored surface, an air gap between the waveguide and the corrective element may produce a reflection of light that enters the waveguide at an angle of incidence that would not result in TIR.
In one embodiment, the eyepiece includes an integrated image source, such as a projector, that introduces content for display to the optical assembly from a side of the optical waveguide adjacent to an arm of the eyepiece. Unlike prior-art optical assemblies, where image injection occurs from the top side of the optical waveguide, the present disclosure provides image injection into the waveguide from the side of the waveguide. The aspect ratio of the displayed content is between approximately square and approximately rectangular, with the long axis approximately horizontal. In embodiments, the aspect ratio of the displayed content is 16:9. In embodiments, a rectangular aspect ratio for the displayed content, with the long axis approximately horizontal, may be achieved by rotating the injected image. In other embodiments, it may be achieved by stretching the image until it attains the desired aspect ratio.
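The rotate-versus-stretch choice described above can be sketched as a small helper. This is an illustrative reconstruction, not the patent's method: the function name, the exact-ratio rotation test, and the width-only stretch are assumptions.

```python
from fractions import Fraction

def to_landscape(width, height, target=Fraction(16, 9)):
    """Return (operation, new_width, new_height) yielding the target
    landscape aspect ratio from a side-injected image."""
    if Fraction(height, width) == target:   # portrait 9:16 source
        return ("rotate90", height, width)  # a 90-degree rotation suffices
    new_width = int(height * target)        # otherwise stretch the width
    return ("stretch", new_width, height)

print(to_landscape(720, 1280))  # → ('rotate90', 1280, 720)
print(to_landscape(1024, 768))  # → ('stretch', 1365, 768)
```

In practice a stretch would be performed by the display pipeline's scaler; the helper only computes the geometry.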
Fig. 5 depicts a design of the waveguide eyepiece showing sample dimensions. For example, in this design, the width of the coupling lens 504, to which the optical display 502 is optically coupled in series, may be 13–15 mm. These elements may be disposed in one arm of the eyepiece, or redundantly in both arms. Image light from the optical display 502 is projected through the coupling lens 504 into the freeform waveguide 508. The thickness of the composite lens 520, comprising the waveguide 508 and the correction lens 510, may be 9 mm. In this design, the waveguide 508 enables an exit pupil diameter of 8 mm with an eye clearance of 20 mm. The resulting see-through view 512 is approximately 60–70 mm. The distance (dimension a) from the pupil to the point where the image light path enters the waveguide 508 is approximately 50–60 mm, which accommodates a large percentage of human head widths. In one embodiment, the field of view may be larger than the pupil. In embodiments, the field of view may not fill the lens. It should be understood that these dimensions are for a particular illustrative embodiment and should not be construed as limiting. In one embodiment, the waveguide, snap-on optics, and/or corrective lenses may comprise optical plastic. In other embodiments, the waveguide, snap-on optics, and/or corrective lenses may comprise glass, marginal glass, bulk glass, metallic glass, palladium-reinforced glass, or other suitable glass. In embodiments, the waveguide 508 and the correction lens 510 may be made of different materials selected to introduce little to no chromatic aberration. The materials may include diffraction gratings, holographic gratings, and the like.
In embodiments such as the one shown in Fig. 1, where two projectors 108 are used for the left and right images, the projected images may be stereoscopic. To enable stereo viewing, the projectors 108 may be disposed at an adjustable distance from each other, permitting adjustment based on the interpupillary distance of each individual wearer of the eyepiece. For example, a single optical assembly may include two independent electro-optic modules with various adjustments for horizontal, vertical, and tilt positioning. Alternatively, the optical assembly may include only a single electro-optic module.
Figures 146 through 149 schematically show an embodiment of an augmented reality (AR) eyepiece 14600 (without its temple portions) in which the placement of the image can be adjusted. Figures 146 and 147 show front and rear perspective views, respectively, of the AR eyepiece 14600. In this embodiment, the electronics and portions of the projection system (collectively 14602) are located above the lenses 14604a, 14604b. The AR eyepiece 14600 has two projection screens 14608a, 14608b, adjustably suspended from an adjustment platform 14610 on the wearer's side of the lenses 14604a, 14604b. Mounted on the adjustment platform 14610 are mechanisms for adjusting the lateral position, relative to the bridge 14612 of the AR eyepiece 14600, and the tilt of each projection screen 14608a, 14608b.
The mechanisms for adjusting the position of one or both display screens may be controlled by software-activated motors actuated manually (for example, via buttons), by manual control devices (thumbwheels, lever arms, and the like), or by a combination of both motorized and manual devices. The AR eyepiece 14600 uses manual devices, which will now be described. Those skilled in the art will appreciate that the adjustment mechanisms are designed so that the lateral adjustment is decoupled from the tilt adjustment.
Figure 148 shows a rear perspective view of the wearer's left-side portion of the AR eyepiece 14600, in which the adjustment mechanism 14614 on the adjustment platform 14610 for the projection screen 14608a is perhaps more clearly shown. The projection screen 14608a is mounted on a frame 14618, which is fixedly attached to (or is part of) a movable carriage 14620. On its bridge-14612 side, the carriage 14620 carries an axle 14622 that is rotatably and slidably supported in an arcuate slot of a first support block 14624 attached to the adjustment platform 14610. On its temple side, the carriage 14620 is rotatably and slidably supported by a yoke 14628. Referring to Figure 150, the yoke 14628 has a shaft portion 14630 that is fixedly attached to the carriage 14620 and is coaxial with the carried axle 14622, providing the carriage 14620 with an axis of rotation. The yoke 14628 is slidably and rotatably supported in an arcuate slot of a second support block 14632 (see Figure 151) attached to the adjustment platform 14610.
The yoke 14628 also has two parallel arms 14634a, 14634b extending radially outward from the shaft portion 14630. The free end of each arm 14634a, 14634b has a hole, such as the hole 14638 of arm 14634b, for fixedly capturing an axle 14678 therebetween, as described below (see Figure 149). The arm 14634a has an anchor portion 14640 where it attaches to the shaft portion 14630 of the yoke 14628. The anchor portion 14640 has a vertical through-hole 14642 for slidably capturing a pin 14660, as described below (see Figure 152).
Referring again to Figure 148, the adjustment mechanism has a first thumbwheel 14644 for controlling the lateral position of the projection screen 14608a and a second thumbwheel 14648 for controlling the tilt of the projection screen 14608a. The first thumbwheel 14644 extends partially through a slot 14650 in the adjustment platform 14610 and is threadedly engaged and supported by a first threaded shaft 14652. The first threaded shaft 14652 is slidably supported in through-holes in third and fourth support blocks 14654, 14658. The third and fourth blocks 14654, 14658 and/or the sides of the slot 14650 prevent the first thumbwheel 14644 from moving laterally. Thus, rotating the thumbwheel 14644 about its axis (indicated by arrow A) causes the first threaded shaft 14652 to move laterally (indicated by arrow B). As best seen in Figure 152, the first threaded shaft 14652 has a pin 14660 extending radially outward on its bridge side. (Note that the threads of the first threaded shaft 14652 are not depicted in the drawings, but they may be single- or multi-pitch threads.) The pin 14660 is slidably captured by the vertical through-hole 14642 in the anchor portion 14640 of the arm 14634a of the yoke 14628. When the first thumbwheel 14644 is turned in the direction that causes the first threaded shaft 14652 to advance laterally toward the bridge 14612, the pin 14660 pushes against the bridge-14612 side of the through-hole 14642, which in turn moves the yoke 14628, the carriage 14620, the frame 14618, and the first projection screen 14608a laterally toward the bridge 14612 (see arrow C). Similarly, turning the first thumbwheel 14644 in the opposite direction moves the first projection screen laterally away from the bridge 14612.
The second thumbwheel 14648 is used to control the tilt of the first projection screen 14608a about the axis defined by the carriage axle 14622 and the yoke shaft portion 14630. Referring now to Figure 153, the second thumbwheel 14648 is fixedly attached to a narrow portion 14662 of a hollow flanged shaft 14664. The flange portion 14668 of the flanged shaft 14664 threadedly receives the threaded shank portion 14670 of an eye hook 14672. (Note that the threads of the threaded shank portion 14670 are not depicted in the drawings, but they may be single- or multi-pitch threads.) In use, the narrow portion 14662 of the flanged shaft 14664 passes rotatably through a countersunk hole 14674 (see Figure 151) in the adjustment platform 14610, so that the thumbwheel 14648 is on the underside of the adjustment platform 14610 and the eye hook 14672 is on the top side, with the flange portion 14668 of the flanged shaft 14664 captured in the countersunk portion of the countersunk hole 14674. Referring again to Figure 149, the eye of the eye hook 14672 slidably engages around the axle 14678, which is captured in the holes in the free ends of the yoke arms 14634a, 14634b. Thus, rotating the second thumbwheel 14648 about its axis (as indicated by arrow D) causes the flanged shaft 14664 to rotate with it, which causes the threaded shank portion 14670 of the eye hook 14672 to move vertically in or out relative to the flange portion 14668 (as indicated by arrow E), which pushes or pulls the eye of the eye hook 14672 against the axle 14678, which moves the yoke 14628 about its axis, thereby tilting the first projection screen 14608a away from or toward the wearer (as indicated by arrow F).
Referring again to Figure 148, note that the electronics and portions of the projection system 14602a are located on a platform 14680 fixed to the top of the carriage 14620. Thus, the spatial relationship between the projection screen 14608a and its associated projection-system electronics and components 14602a remains essentially unchanged by any lateral or tilt adjustment made to the projection screen 14608a.
The AR glasses 14600 also include an adjustment mechanism, similar to the adjustment mechanism 14614 just described, for laterally positioning and tilting the second projection screen 14608b on the wearer's right side of the AR eyepiece 14600.
In one embodiment, the eyepiece may include tilted or curved rails for IPD adjustment, so that the optical modules remain better seated in a curved frame. In some embodiments, the displays may be adapted to connect to such tilted or curved rails.
In embodiments, one or more display screens of the AR eyepiece are disposed parallel to a line connecting the user's eyes. In some embodiments, one or more display screens are rotated about their vertical axes so that the end nearer the nose is rotated inward toward the eyes, away from parallel with the line connecting the user's eyes, by an angle in the range of about 0.1 to about 5 degrees — i.e., 'toed in'. In some of these latter embodiments, the toe-in angle is permanently fixed, while in others the toe-in angle is user-adjustable. In some of the user-adjustable embodiments, the adjustability is limited to two or more preset positions, such as positions representing near convergence, intermediate-distance convergence, and far convergence. In other embodiments, the adjustability is continuous. Preferably, in embodiments of the modified AR glasses that also incorporate the automatic vergence correction disclosed herein, the amount of toe-in is taken into account in the vergence correction. In embodiments where the toe-in is constant, the toe-in amount may be incorporated directly into the automatic vergence correction without a position sensor; in user-adjustable embodiments, however, it is preferred to use a position sensor to communicate the current toe-in amount to the processor for use in the vergence-correction calculation. In embodiments where the toe-in angle is user-adjustable, the adjustment may be performed manually, such as by a thumbwheel that directly — or indirectly, for example through a drive train — selectively rotates one or both display screens about their vertical axes, or it may be motorized to accomplish the selected rotation when activated by the user through a user interface or control switch.
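The interaction between a fixed toe-in and an automatic vergence correction can be illustrated with simple geometry. This sketch is an assumption about how such a correction might be computed, not the patent's algorithm: the symmetric half-IPD/arctangent model and the function names are mine.

```python
import math

def vergence_per_eye_deg(ipd_mm, distance_mm):
    """Inward rotation (degrees) each eye needs to fixate an object at
    distance_mm, assuming symmetric fixation straight ahead."""
    return math.degrees(math.atan((ipd_mm / 2.0) / distance_mm))

def residual_rotation_deg(ipd_mm, distance_mm, toe_in_deg):
    """Per-screen rotation still required after subtracting the toe-in
    already built into the display mounting (clamped at zero)."""
    return max(vergence_per_eye_deg(ipd_mm, distance_mm) - toe_in_deg, 0.0)

print(round(vergence_per_eye_deg(64, 500), 2))        # → 3.66
print(round(residual_rotation_deg(64, 500, 0.5), 2))  # → 3.16
```

With a user-adjustable toe-in, the `toe_in_deg` argument would come from the position sensor mentioned above.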
In some cases, the toe-in adjustment can be used to relax the user's eyes during long sessions of activity in which the eyes remain at a particular focal length (for example, when reading, or watching a monitor, a ball game, or the horizon). The toe-in adjustments described above may be used to better align the display screens with the user's eyes by effectively rotating them, and to adjust for the user's interpupillary distance.
In embodiments, the present disclosure provides a mechanical interpupillary distance adjustment, such as where the optical assembly of the eyepiece is position-adjustable by the user within the frame, giving the user the ability to change the position of the optical assembly with respect to the user's eyes. The position adjustment may control the horizontal position, vertical position, tilt, and the like of the optical assembly within the eyeglass frame.
In embodiments, the present disclosure may provide a digital interpupillary distance adjustment, such as where the integrated processor executes a pupil-alignment process that enables the user to adjust the placement position at which displayed content is presented in the field of view of the eyepiece's optical assembly, thereby setting pupil-alignment calibration factors for use in the placement of subsequently displayed content. The calibration factors may include horizontal and/or vertical adjustments of the displayed content within the field of view. The calibration factors may include a plurality of calibration factors, each representing a distance to a real object, where the distance calibration factor is used when positioning content in the field of view based on a calculation of the distance to a real object. The calibration process may be based on this plurality of calibration factors. The positioning of an image can be adjusted on the display to move it within the field of view. Moving two images farther apart makes the imaged object appear farther away, while moving the images closer together makes the object appear closer. The difference in the position of an object in the field of view of each eye is referred to as parallax. Parallax is related to the perceived distance of the object from the user.
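One way to use a set of per-distance calibration factors, as described above, is to interpolate between them for the computed object distance. The table values, the pixel-shift units, and the linear interpolation below are purely illustrative assumptions:

```python
import bisect

# Hypothetical calibration table: (distance_mm, horizontal_shift_px)
CALIBRATION = [(500, 24.0), (1000, 12.0), (4000, 3.0)]

def shift_for_distance(d_mm, table=CALIBRATION):
    """Horizontal placement shift for content rendered at a perceived
    distance, linearly interpolated and clamped to the calibrated range."""
    distances = [d for d, _ in table]
    if d_mm <= distances[0]:
        return table[0][1]
    if d_mm >= distances[-1]:
        return table[-1][1]
    i = bisect.bisect_left(distances, d_mm)
    (d0, s0), (d1, s1) = table[i - 1], table[i]
    return s0 + (d_mm - d0) / (d1 - d0) * (s1 - s0)

print(shift_for_distance(750))    # → 18.0
print(shift_for_distance(10000))  # → 3.0
```

Larger shifts (greater parallax) correspond to nearer perceived objects, consistent with the relationship stated above.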
Referring now to Figure 173, an exploded view of the glasses is depicted. The electronics 17302 are located in the front frame of the glasses, above the eyebrows, and include the CPU, display drivers, camera, radio, processor, user interface, and the like. The optical modules 17308 are attached to the frame using covers 17304, which may optionally be lenses. The lenses 17304 may be tinted or tintable. Although a stereoscopic embodiment is shown here, it should be understood that a single optical module 17308 may also be used. The electronics 17302 are enclosed by a cover 17314 that includes a physical user interface 17310, which may be a button, a touch interface, a rotary dial, a switch, or any other physical user interface. The physical user interface 17310 can control various aspects of the glasses, such as functions of the glasses, applications running on the glasses, or applications controlling external devices. The user can conveniently operate this control facility by grasping the lower part of the frame to steady it while touching the control facility/UI on the top of the frame. The arms 17312 rest on the ears and may include a head strap for securing the glasses, a socket for audio/earphone functionality or an external audio device, a battery 17318 or power supply facility, and the like. The battery 17318 may be placed in either arm; options for the battery 17318 are disclosed herein, but the battery 17318 may also be any available battery type. The head strap may be an ear band made of nitinol or another shape-memory alloy. The ear band may take the form of a strap, or, as in Figure 177, the ear band 17702 may be in bent-wire form, which is slimmer, lighter, and less costly. For cosmetic purposes, the frame may be any color, the lenses may be any color, and the tips of the eyepiece arms, or at least the arms, may be colored. For example, the nitinol forming the arm tips may be colored.
Referring now to Figure 174, the battery is enabled to power the electronics in the front frame through an operable hinge 17408 and through a wiring design that uses a minimum number of wires and routes them through the hinge in a wire guide 17404. The wiring design may include a wire 17402 running from the front-frame electronics to an earphone located on the arm. Figure 175 depicts an enlarged version of Figure 174, focusing on the wire 17402 in the wire guide 17404. Figures 176A–C depict the wire guide with various portions of the frame and the glasses' interior sectioned away. The view is from the user's side of the frame, looking at the hinge. Figure 176A shows a section through most of the material, Figure 176B shows a section through nearly as much of the material, and Figure 176C shows the complete glasses.
Fig. 6 depicts an embodiment of the eyepiece 600 with a see-through or translucent lens 602. A projected image 618 can be seen on the lens 602. In this embodiment, the image 618 being projected onto the lens 602 happens to be an augmented reality version of the scene the wearer is looking at, in which tagged points of interest (POIs) in the field of view are shown to the wearer. The augmented reality version may be enabled by a forward-facing camera embedded in the eyepiece (not shown in Fig. 6) that images what the wearer is looking at and identifies the location/POIs. In one embodiment, the output of the camera or optical transmitter may be sent to the eyepiece controller or to memory for storage, for transmission to a remote location, or for viewing by the person wearing the eyepiece or glasses. For example, the video output may be streamed to a virtual screen for viewing by the user. The video output may thus be used to help determine the user's position, or it may be transmitted remotely to others to assist in locating the wearer, or for any other purpose. GPS, RFID, manual input, or other detection technologies may be used to determine the wearer's position. Using the position or identification data, a database may be accessed by the eyepiece to obtain information that can be overlaid, projected, or otherwise displayed together with what is being seen. Augmented reality applications and technologies are discussed further herein.
In Fig. 7, an embodiment of the eyepiece 700 with a translucent lens 702 is depicted, on which streaming media (an e-mail application) and an incoming call notification 704 are being displayed. In this embodiment, the media obscures part of the viewing area; it should be understood, however, that the displayed image may be placed anywhere in the field of view. In embodiments, the media may be made more or less transparent.
In one embodiment, the eyepiece may receive input from an external source, such as an external converter box. The source may be depicted in a lens of the eyepiece. In one embodiment, when the external source is a phone, the eyepiece may use the phone's location capabilities to display location-based augmented reality, including marker overlays from marker-based AR applications. In embodiments, a VNC client running on the eyepiece's processor or on an associated device may be used to connect to and control a computer, where the computer's display is viewed in the eyepiece by the wearer. In one embodiment, content from any source may be streamed to the eyepiece, such as the display from a panoramic camera mounted on the roof of a vehicle, the user interface of a device, imagery from a drone or helicopter, and the like. For example, when the feed from a camera mounted on a gun is directed to the eyepiece, the lenses enable shooting at targets not in the direct line of sight.
The lenses may be chromic, such as photochromic or electrochromic. An electrochromic lens may include a bulk chromic material, or a chromic coating, that changes the opacity of at least a portion of the lens in response to a burst of charge applied across the chromic material by the processor. For example, and referring to Fig. 9, the chromic portion 902 of the lens 904 is shown darkened, such as to give the wearer greater viewability while that portion is presenting displayed content to the eyepiece wearer. In embodiments, there may be multiple chromic areas on the lens that can be controlled independently, such as a large portion of the lens, a sub-portion of the projected area, a programmable area of the lens and/or projected area, areas controlled down to the pixel level, and the like. Activation of the chromic material may be controlled via the control techniques further described herein, or enabled automatically for certain applications (e.g., a streaming video application, a solar-tracking application, an ambient light sensor, a camera tracking brightness in the field of view), or controlled in response to a UV sensor embedded in the frame. In embodiments, the electrochromic layer may be on an optical element between other optical elements and/or on a surface of the eyepiece, on a corrective lens, on a ballistic lens, and the like. In one example, the electrochromic layer may be made of a stack, such as indium tin oxide (ITO)-coated PET/PC films with two electrochromic (EC) layers between them; this allows another PET/PC layer to be removed, thereby reducing reflections (e.g., the layer stack may comprise PET/PC–EC–PET/PC–EC–PET/PC). In embodiments, the electrically controllable optical layer may be a liquid-crystal-based solution providing a bi-stable state with color. In other embodiments, multiple layers of liquid crystal, or alternating electrochromic layers, forming the optical layer may be used to provide variable shading, such that certain layers or sections of the optical layer can be turned on or off in stages. The electrochromic layer may refer generally to any electrically controlled transparency in the eyepiece, including SPD, LCE, electrowetting, and the like.
In embodiments, the lenses may have an angle-sensitive coating that transmits light waves with low angles of incidence and reflects light with high angles of incidence, such as s-polarized light. The chromic coating may be controlled in portions or in its entirety, such as by the control technologies described herein. The lenses may be variable-contrast, with the contrast controlled by a push button or by any other control technique described herein. In embodiments, the user may wear an interactive head-mounted eyepiece, where the eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content. The optical assembly may include a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly. The optical assembly may include an electrochromic layer that provides display-characteristic adjustment dependent on the requirements of the displayed content and on ambient conditions. In embodiments, the display characteristics may be brightness, contrast, and the like. The ambient condition may be a level of brightness that, without the display-characteristic adjustment, would make the displayed content difficult for the eyepiece wearer to view, where the display-characteristic adjustment may be applied to the area of the optical assembly where content is being displayed.
In embodiments, the eyepiece may have controls for brightness, contrast, spatial resolution, and the like for the eyepiece projected area, such as to alter and improve the user's view of the projected content against a bright or dark surrounding environment. For example, a user may be using the eyepiece under bright daylight conditions, and in order for the user to clearly see the displayed content, the display area may need to be altered in brightness and/or contrast. Alternatively, the viewing area surrounding the display area may be altered. In addition, the altered area, whether within the display area or not, may be spatially oriented or controlled per the application being implemented. For instance, only a small portion of the display area may need to be altered, such as when that portion deviates from some determined or predetermined contrast ratio between the displayed portion of the display area and the surrounding environment. In embodiments, portions of the lens may be altered in brightness, contrast, spatial extent, resolution, and the like, such as fixed to include the entire display area, adjusted to only a portion of the lens, adaptable and dynamic to changes in the lighting conditions of the surrounding environment and/or the brightness-contrast of the displayed content, and the like. The spatial extent (e.g. the area affected by the alteration) and the resolution (e.g. display optical resolution) may vary over different portions of the lens, including high-resolution segments, low-resolution segments, single-pixel segments, and the like, where combinations of the differing segments may achieve the viewing objectives of the application(s) being executed. In embodiments, technologies for implementing alterations of brightness, contrast, spatial extent, resolution, and the like may include electrochromic materials, LCD technologies, beads embedded in the optics, flexible displays, suspended particle device (SPD) technologies, colloid technologies, and the like.
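The contrast-ratio test above — altering only those portions that deviate from a predetermined ratio against the background — can be sketched as follows. The 4.5:1 target, the lux-to-luminance approximation, and the function names are illustrative assumptions, not values taken from the patent.

```python
# Hedged sketch: flag only those lens regions where the contrast ratio between
# displayed content and the background behind it falls below a target ratio.

def regions_to_darken(display_luminance, background_lux_per_region, target_ratio=4.5):
    """Return indices of regions whose content-to-background contrast is too low."""
    flagged = []
    for i, bg_lux in enumerate(background_lux_per_region):
        # rough illuminance -> luminance conversion for a diffuse scene (lux / pi)
        bg_luminance = bg_lux / 3.14159
        ratio = display_luminance / max(bg_luminance, 1e-6)
        if ratio < target_ratio:
            flagged.append(i)
    return flagged
```

The flagged regions would then be darkened (or the display brightened) only where needed, leaving the rest of the see-through view untouched.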
In embodiments, there may be various modes of activation of the electrochromic layer. For example, the user may enter a sunglasses mode, where the composite lenses appear only somewhat darkened, or a 'blackout' mode, where the composite lenses appear completely blackened.
An example of a technology that may be employed in implementing alterations of brightness, contrast, spatial extent, resolution, and the like may be electrochromic materials, films, inks, and the like. Electrochromism is the phenomenon displayed by some materials of reversibly changing appearance when an electric charge is applied. Various types of materials and structures may be used to construct electrochromic devices, depending on the specific application. For instance, electrochromic materials include tungsten oxide (WO3), which is a main chemical used in the production of electrochromic windows or smart glass. In embodiments, electrochromic coatings may be used on the lens of the eyepiece in implementing alterations. In another example, electrochromic displays may be used in implementing 'electronic paper', which is designed to mimic the appearance of ordinary paper, where the electronic paper displays reflected light like ordinary paper. In embodiments, electrochromism may be implemented in a wide variety of applications and materials, including Gyricon (consisting of polyethylene spheres embedded in a transparent silicone sheet, with each sphere suspended in a bubble of oil so that it can rotate freely), electrophoretic displays (forming images by rearranging charged pigment particles with an applied electric field), E-Ink technology, electrowetting, electrofluidics, interferometric modulators, organic transistors embedded in flexible substrates, nanochromic displays (NCD), and the like.
Another example of a technology that may be employed in implementing alterations of brightness, contrast, spatial extent, resolution, and the like may be suspended particle devices (SPD). When a small voltage is applied to an SPD film, its microscopic particles, which in their stable state are randomly dispersed, become aligned and allow light to pass through. The response may be immediate and uniform, and have a stable color throughout the film. Adjustment of the voltage may allow users to control the amount of light, glare, and heat passing through. The system's response may range from a dark blue appearance that fully blocks light in its off state to a clear appearance in its on state. In embodiments, SPD technology may be an emulsion applied on a plastic substrate to create the active film. This plastic film may be laminated (as a single glass pane), suspended between two sheets of glass, plastic, or other transparent materials, and the like.
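The voltage-to-transmission behavior of an SPD film described above can be sketched as a simple drive model. The 0–100 V range and the dark/clear transmission endpoints are assumptions for illustration; actual SPD films have their own characteristic curves.

```python
# Minimal sketch of mapping SPD drive voltage to fractional light transmission,
# assuming a linear response between an off (dark, near-blocking) state and a
# fully driven clear state; the drive range and endpoints are illustrative.

def spd_transmission(voltage, v_max=100.0, t_dark=0.005, t_clear=0.75):
    """Interpolate fractional light transmission for an SPD film at a voltage."""
    v = min(max(voltage, 0.0), v_max)  # clamp to the valid drive range
    return t_dark + (t_clear - t_dark) * (v / v_max)
```

A controller would raise the drive voltage to let more ambient light through and drop it to darken the film, per the user control or sensor input.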
Referring to Figs. 8A–C, in certain embodiments the electro-optics may be mounted in a monocular or binocular flip-up/flip-down arrangement in two parts: 1) the electro-optics; and 2) the corrective lens. Fig. 8A depicts a two-part eyepiece where the electro-optics are contained in an electro-optics module 802 that may be electrically connected to the eyepiece 804 via an electrical connector, such as a plug, pin, socket, wiring, and the like. In this arrangement, the lens 818 in the frame 814 may be entirely a corrective lens. The interpupillary distance (IPD) between the two halves of the electro-optics module 802 may be adjusted at the bridge 808 to accommodate various IPDs. Likewise, the placement of the displays 812 may be adjusted via the bridge 808. Fig. 8B depicts a binocular electro-optics module 802 where one half is flipped up and the other is flipped down. The nose bridge may be fully adjustable and elastomeric. This allows a three-point mount on the head, at the nose bridge and the ears, to ensure image stability in the user's eyes, unlike the instability of helmet-mounted optics that shift on the scalp. Referring to Fig. 8C, the lens 818 may be an ANSI-compliant, hard-coated, scratch-resistant polycarbonate ballistic lens; it may be chromic, may have an angle-sensitive coating, may include a UV-sensitive material, and the like. In this arrangement, the electro-optics module may include a CMOS-based black silicon VIS/NIR/SWIR sensor for night vision capability. The electro-optics module 802 may feature quick-disconnect capability for user flexibility, field replacement, and upgrades. The electro-optics module 802 may have an integrated power connector.
In Fig. 79, the flip-up/flip-down lens 7910 may include a light block 7908. A removable, elastomeric night adapter/light dam/light block 7908 may be used to shade the flip-up/flip-down lens 7910, such as for nighttime operation. The exploded view of the eyepiece also depicts the head strap 7900, the frame 7904, and the adjustable nose bridge 7902. Fig. 80 depicts exploded views of the electro-optic assembly in front (A) and side-angle (B) views. A retainer 8012 holds the see-through optics to the corrective lens 7910. An O-ring 8020 and a screw 8022 secure the retainer to a shaft 8024. A spring 8028 provides a spring-loaded connection between the retainer 8012 and the shaft 8024. The shaft 8024 attaches to a steel frame 8014, which is secured to the eyepiece with thumb screws 8018. The shaft 8024 acts as a hinge and provides IPD adjustment via an IPD adjustment knob 8030. As seen in Fig. 81, the knob 8030 turns along an adjustment thread 8134. The shaft 8024 also has two retaining spiral grooves 8132.
In embodiments, a photochromic layer may be included as part of the optics of the eyepiece. Photochromism is the reversible transformation of a chemical species between two forms by the absorption of electromagnetic radiation, where the two forms have different absorption spectra, such as a reversible change of color or darkening upon exposure to a given frequency of light. In one example, a photochromic layer may be included between the waveguide and the corrective optics of the eyepiece, on the outside of the corrective optics, and the like. In embodiments, the photochromic layer (such as when used as a darkening layer) may be activated with a UV diode or another photochromically responsive wavelength as known in the art. In the case of a UV-photo-activated photochromic layer, the eyepiece optics may also include a UV coating on the outside of the photochromic layer, to prevent UV light from the sun from activating it unintentionally.
Current photochromic devices change quickly from clear to dark, and slowly from dark back to clear. This is due to the molecular changes involved in the transition of the photochromic material from clear to dark. After the photochromic molecules are removed from UV light, such as UV light from the sun, they vibrate back to the clear state. By increasing the vibration of the molecules, such as through exposure to heat, the optics will clear more rapidly. The dark-to-clear rate of a photochromic layer is thus related to temperature. A rapid change from dark to clear is particularly important for military applications, where the user of sunglasses often moves from a bright exterior environment into a dark interior environment, and it is important to be able to see quickly in the interior environment.
The present disclosure provides a photochromic layer with an attached heater, where the heater is used to accelerate the dark-to-clear transition in the photochromic material. This approach relies on the relationship between the speed of the dark-to-clear transition in the photochromic material and temperature, where the transition is much faster at higher temperature. To allow the heater to raise the temperature of the photochromic material rapidly, the photochromic material is provided as a thin layer with a thin heater. By keeping the thermal mass per unit area of the photochromic film device low, the heater need supply only a small amount of heat to rapidly produce a large temperature change in the photochromic material. Since the photochromic material only needs to be at an elevated temperature during the dark-to-clear transition, the heater only needs to be used for a short time, so the electrical power requirement is low.
The heater may be a thin, transparent heating element, such as an ITO heater or any other transparent and conductive film material. When the user needs the eyepiece to become clear quickly, the user may activate the heating element through any of the control techniques described herein.
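A rough sizing of the short heater pulse can be sketched from the low-thermal-mass argument above. The rule of thumb that the fade rate roughly doubles per 10 °C rise, and all the constants below (target temperature, per-area heat capacity, heater power), are illustrative assumptions, not figures from the patent.

```python
# Hedged sketch of sizing a heater pulse for a thin photochromic film.
# Assumes the dark-to-clear fade rate roughly doubles per 10 degC of heating;
# every constant here is illustrative only.

def heater_on_time(ambient_c, target_c=45.0,
                   film_heat_capacity_j_per_c=0.8, heater_power_w=2.0):
    """Seconds of heater drive needed to lift the low-thermal-mass film to target."""
    delta = max(target_c - ambient_c, 0.0)
    return film_heat_capacity_j_per_c * delta / heater_power_w

def fade_speedup(ambient_c, heated_c):
    """Approximate dark-to-clear rate multiplier from heating (x2 per 10 degC)."""
    return 2.0 ** ((heated_c - ambient_c) / 10.0)
```

The point of the thin-film design shows up directly in `heater_on_time`: a small heat capacity per unit area makes the required pulse short, which is why the power requirement stays low.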
In one embodiment, the heating element may be used to calibrate the photochromic element, and to compensate for cold environmental conditions in which the lens darkens on its own.
In another embodiment, a thin coating of photochromic material may be placed on a thick substrate, with the heater element stacked on top. For example, an over-molded sunglass lens may include the accelerated photochromic solution, while still having a separate electrochromic sheet over the display area, used with or without selective UV light control.
Fig. 94A depicts a photochromic film device with a serpentine heater pattern, and Fig. 94B depicts a side view of the photochromic film device, where the device is a lens for sunglasses. The photochromic film device does not contact the protective cover lens shown above it, in order to reduce the thermal mass of the device.
United States Patent 3,152,215 describes a heater layer combined with a photochromic layer, where the photochromic material is heated for the purpose of reducing the dark-to-clear transition time. However, the photochromic layer is contained in a wedge, which greatly increases the thermal mass of the device and thereby reduces the rate at which the heater can change the temperature of the photochromic material, or greatly increases the power required to change the temperature of the photochromic material.
The present disclosure includes the use of a thin carrier layer to which the photochromic material is applied. The carrier layer can be glass or plastic. The photochromic material can be applied to the carrier layer by vacuum coating, by dipping, or by thermal diffusion, as is known in the art. The thickness of the carrier layer can be 150 microns or less. The carrier layer thickness is selected based on the darkness required of the photochromic film device in the dark state and the speed required for the transition between the dark state and the clear state. A thicker carrier layer can be darker in the dark state, while being slower to heat to an elevated temperature because of its larger thermal mass. Conversely, a thinner carrier layer is less dark in the dark state, while being faster to heat to an elevated temperature because of its smaller thermal mass.
The protective layer shown in Fig. 94 is separated from the photochromic film device so that the thermal mass of the photochromic film device is kept low. In this way, the protective layer can be made thicker to provide greater impact strength. The protective layer can be glass or plastic; for example, the protective layer can be polycarbonate.
The heater can be a transparent conductor formed in a relatively uniform conductive trace, so that the heat generated along the length of the formed heater is relatively uniform. One example of a transparent conductor that can be so formed is indium tin oxide. As shown in Fig. 94A, large areas are provided at each end of the heater pattern for making electrical contact.
As noted in the discussion of Figs. 8A–C, the augmented reality glasses may include a lens 818 for each eye of the wearer. The lenses 818 may be made to fit easily into the frame 814, so that each lens can be customized for the person intended to use the glasses. Thus, the lenses may be corrective lenses, and may also be tinted for use as sunglasses, or have other qualities suitable for the intended environment. Accordingly, the lenses may be tinted yellow, dark, or another suitable color, or may be photochromic, so that the transparency of the lens decreases when the lens is exposed to brighter light. In one embodiment, the lenses may also be designed to snap into or onto the frames; i.e., snap-on lenses are one embodiment. For example, the lenses may be made from high-quality Schott optical glass and may include a polarizing filter.
Of course, the lenses need not be corrective lenses; they may simply serve as sunglasses or as protection for the optical system within the frame. In non-flip-up/flip-down arrangements, it goes without saying that the outer lenses are important in helping to protect the rather expensive waveguides, viewing systems, and electronics within the augmented reality glasses. At a minimum, the outer lenses provide protection against abrasion from the user's environment, whether sand, brambles, thorn bushes, and the like in one environment, or flying debris, bullets, and shrapnel in another. In addition, the outer lenses may be decorative, acting to change the look of the composite lens, perhaps appealing to the individuality or fashion sense of the user. The outer lenses may also help an individual user distinguish his or her glasses from those of others, for example when many users are gathered together.
It is desirable that the lenses be suitable for impact, such as ballistic impact. Accordingly, in one embodiment, the lenses and frames meet ANSI Standard Z87.1-2010 for ballistic resistance. In one embodiment, the lenses also meet ballistic standard CE EN166B. In another embodiment, for military uses, the lenses and frames may meet the standards of MIL-PRF-31013, standard 3.5.1.1 or 4.4.1.1. Each of these standards has slightly different requirements for ballistic resistance, and each is intended to protect the eyes of the user from impact by high-speed projectiles or debris. While no particular material is specified, an appropriate grade of polycarbonate is usually sufficient to pass the tests specified in the applicable standard.
In one embodiment, as seen in Fig. 8D, the lenses snap in from the outside of the frame rather than the inside, for better impact resistance, since any impact is expected to come from the outside of the augmented reality glasses. In this embodiment, the replaceable lens 819 has a plurality of snap-fit arms 819a that fit into recesses 820a of the frame 820. The engagement angle 819b of the arms is greater than 90°, and the engagement angle of the recesses is also greater than 90°. Making the angles greater than a right angle has the practical effect of allowing the lens 819 to be removed from the frame 820. The lens 819 may need to be removed if a person's vision has changed, or if a different lens is desired for any reason. The design of the snap fit is such that there is a slight compression or bearing load between the lens and the frame. That is, the lens may be held firmly within the frame, such as by a slight interference fit of the lens within the frame.
The cantilever snap fit of Fig. 8D is not the only possible way to removably snap the lens to the frame. For example, an annular snap fit may be used, in which a continuous sealing lip of the frame engages an enlarged edge of the lens, which then snaps into, or possibly over, the lip. Such a snap fit is commonly used to attach a cap to a pen. This configuration may have the advantage of a sturdier joint, with less chance of admitting very small dust and dirt particles. Possible disadvantages include the rather tight tolerances required around the entire periphery of both the lens and the frame, and the requirement for dimensional integrity in all three dimensions over time.
An even simpler interface, which may still be considered a snap fit, is also possible. A groove may be molded into the outer surface of the frame, with the lens having a protruding surface that may be considered a tongue fitting into the groove. If the groove is semi-cylindrical, such as from about 270° to about 300°, the tongue will snap into the groove and be firmly retained, while removal remains possible through the gap remaining in the groove. In this embodiment, depicted in Fig. 8E, a lens, replacement lens, or cover 826 with a tongue 828 may be inserted into a groove 827 in a frame 825, even though the lens or cover does not snap into the frame. Because this fit is a close one, it will act as a snap fit and securely retain the lens in the frame.
In another embodiment, the frame may be made in two pieces, such as a lower portion and an upper portion, with a conventional tongue-and-groove fit. In another embodiment, this design may also use standard fasteners to ensure a tight grip of the lens by the frame. The design should not require disassembly of anything on the inside of the frame. Thus, the snap-on or other lens or cover should be capable of assembly onto the frame, or removal from the frame, without entering the inside of the frame. As noted elsewhere in this disclosure, the augmented reality glasses have many component parts. Some of the assemblies and subassemblies may require careful alignment. Moving and jarring these assemblies may be detrimental to their function, as may moving and jarring the frame and the outer or snap-on lens or cover.
In embodiments, the flip-up/flip-down arrangement enables a modular design of the eyepiece. For example, not only may the eyepiece be equipped with a monocular or binocular module 802, but the lens 818 may also be replaced. In embodiments, additional features may be included with the module 802, whether the module 802 is associated with one display 812 or two displays 812. Referring to Fig. 8F, the monocular or binocular versions of the module 802 may be display-only (852 monocular, 854 binocular), or may be equipped with a forward-looking camera (858 monocular; 860 and 862 binocular). In some embodiments, the module may have additional integrated electronics, such as a GPS, a laser rangefinder, and the like. In embodiment 862, for enabling urban leader tactical response, awareness, and visualization (also known as 'Ultra-Vis'), the binocular electro-optic module 862 is equipped with stereo forward-looking cameras 870, a GPS, and a laser rangefinder 868. These capabilities allow the Ultra-Vis embodiment to provide panoramic night vision, with laser rangefinding and geolocation.
In one embodiment, the electro-optics characteristics may be, but are not limited to, the following:

In one embodiment, the projector characteristics may be as follows:
In another embodiment, the augmented reality eyepiece may include electrically controlled lenses as part of the micro-projector, or as part of the optics between the micro-projector and the waveguide. Fig. 21 depicts an embodiment with such liquid lenses 2152.
The glasses may also include at least one camera or optical sensor 2130 that may furnish one or more images for viewing by the user. The images are formed by a micro-projector 2114 on each side of the glasses for conveyance to the waveguide 2108 on that side. In one embodiment, an additional optical element, a variable-focus lens 2152, may also be furnished. The lens is electrically adjustable by the user so that the image seen in the waveguides 2108 is focused for the user. In embodiments, the camera may be a multi-lens camera, such as an 'array camera', where the eyepiece processor may combine data from multiple lenses and multiple lens viewpoints to construct a high-quality image. This technique may be referred to as computational imaging, since software is used to process the images. Computational imaging may provide image-processing advantages, such as allowing the composite image to be processed according to functions applied under the individual lens images. For example, since each lens provides its own image, the processor may perform image processing to create an image with special focusing, such as a foveated image, where the focus from one of the lens images is sharp, high resolution, and the like, while the rest of the image is defocused, lower resolution, and the like. The processor may also select portions of the composite image to store in memory while deleting the rest, such as when memory storage is limited and only certain portions of the composite image are important enough to save. In embodiments, the use of an array camera may provide the ability to change the focus of an image after the image has been taken. In addition to its imaging advantages, the array camera may provide a thinner mechanical profile than a traditional single-lens assembly, and therefore make it easier to integrate into the eyepiece.
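The foveated-composite idea above — keeping one lens image sharp while the rest is defocused and lower resolution — can be sketched as follows. The data layout (images as 2D lists) and function names are assumptions for demonstration; a real computational-imaging pipeline would also register and blend the views before compositing.

```python
# Illustrative computational-imaging sketch: build a foveated result from an
# array camera by keeping one chosen sub-image at full resolution and
# down-sampling the others by 2x2 pixel averaging (a crude "defocus").

def downsample2x(img):
    """Average 2x2 pixel blocks (img is a list of equal-length rows of numbers)."""
    out = []
    for r in range(0, len(img) - 1, 2):
        out.append([(img[r][c] + img[r][c + 1] + img[r + 1][c] + img[r + 1][c + 1]) / 4.0
                    for c in range(0, len(img[0]) - 1, 2)])
    return out

def foveated_composite(lens_images, sharp_index):
    """Keep the chosen lens image sharp; down-sample (defocus) all the others."""
    return [img if i == sharp_index else downsample2x(img)
            for i, img in enumerate(lens_images)]
```

The same structure supports the memory-saving behavior described above: the processor could store only the sharp sub-image and discard or heavily down-sample the rest.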
The variable lenses may include the so-called liquid lenses furnished by Varioptic, S.A., of Lyon, France, or by LensVector, Inc., of Mountain View, California, U.S.A. Such lenses may include a central portion with two immiscible liquids. Typically, in these lenses, the path of light through the lens, i.e. the focal length of the lens, is altered or focused by applying an electric potential between electrodes immersed in the liquids. At least one of the liquids is affected by the resulting electric or magnetic field potential. Thus, electrowetting may occur, as described in U.S. Patent Application Publication 2010/0007807, assigned to LensVector, Inc. Other techniques are described in LensVector patent application publications 2009/021331 and 2009/0316097. All three of these disclosures are incorporated herein by reference, as though each page and figure were set forth verbatim herein.
Other patent documents from Varioptic, S.A., describe other devices and techniques for a variable-focus lens, which may also operate through an electrowetting phenomenon. These documents include U.S. Patents 7,245,440 and 7,894,440 and U.S. Patent Application Publications 2010/0177386 and 2010/0295987, each of which is also incorporated herein by reference, as though each page and figure were set forth verbatim herein. In these references, the two liquids typically have different indices of refraction and different electrical conductivities; e.g., one liquid is conductive, such as an aqueous liquid, and the other liquid is insulating, such as an oily liquid. Applying an electric potential may change the thickness of the lens, and does change the path of light through the lens, thus changing the focal length of the lens.
The electrically adjustable lenses may be controlled by the controls of the glasses. In one embodiment, a focus adjustment is made by calling up a menu from the controls and adjusting the focus of the lens. The lenses may be controlled separately or may be controlled together. The adjustment is made by physically turning a control knob, by indicating with a gesture, or by voice command. In another embodiment, the augmented reality glasses may also include a rangefinder, and the focus of the electrically adjustable lenses may be controlled automatically by pointing the rangefinder, such as a laser rangefinder, at a target at a desired distance from the user.
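The rangefinder-driven autofocus above reduces to converting a measured target distance into a lens power and then a drive signal. A minimal sketch, assuming a thin-lens model and an invented linear diopter-to-volt mapping — real liquid-lens drivers have their own calibration curves:

```python
# Hedged autofocus sketch: rangefinder distance -> required optical power ->
# (assumed linear) liquid-lens drive voltage. All constants are illustrative.

def required_power_diopters(target_distance_m, base_power_d=20.0):
    """Total optical power needed to focus at the rangefinder's measured distance."""
    return base_power_d + 1.0 / target_distance_m  # add the vergence of the target

def lens_drive_volts(power_d, volts_per_diopter=2.5, v_offset=30.0):
    """Map requested power to a hypothetical linear liquid-lens drive voltage."""
    return v_offset + volts_per_diopter * (power_d - 20.0)
```

In use, the eyepiece would call `required_power_diopters` each time a new range reading arrives and command the lens driver with the resulting voltage.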
As shown in U.S. Patent 7,894,440, discussed above, the variable lenses may also be applied to the outer lenses of the augmented reality glasses or eyepiece. In one embodiment, the lenses may simply take the place of a corrective lens. The variable lenses with their electrically adjustable control may be used instead of, or in addition to, lenses mounted on the image source or projector. The corrective lens insert provides corrective optics for the user's view of the environment, the outside world, whether or not the waveguide display is active.
It is important that the images presented to the wearer of the augmented reality glasses or eyepiece, i.e. the images seen in the waveguide, be stabilized. The view or images presented travel from one or two digital cameras or sensors mounted on the eyepiece to digital circuitry, where the images are processed and, if desired, stored as digital data, before they appear in the display of the glasses. In any event, and as described above, the digital data is then used to form an image, such as by using an LCOS display and a series of RGB light-emitting diodes. The light images are processed using a series of lenses, a polarizing beam splitter, an electrically powered liquid corrective lens, and at least one transition lens from the projector to the waveguide.
The process of gathering and presenting images involves several mechanical and optical linkages between components of the augmented reality glasses. It seems clear, therefore, that some form of stabilization will be required. This may include optical stabilization of the most proximate cause, the camera itself (since it is mounted on a mobile platform), or of the glasses, which are themselves movably mounted on a mobile user. Accordingly, camera stabilization or correction may be required. In addition, at least some stabilization or correction should be applied to the liquid variable lens. Ideally, a stabilization circuit at that point could correct not only for the liquid lens, but also for any aberration and vibration from the many parts of the circuitry upstream of the liquid lens, including the image source. One advantage of the present system is that many commercial off-the-shelf cameras are very advanced and typically have at least one image-stabilization feature or option. Thus, there may be many embodiments of this disclosure, each with a same or different method of stabilizing an image or a very fast stream of images, as described below. The term optical stabilization is typically used herein in the sense of physically stabilizing the camera, camera platform, or other physical object, while image stabilization refers to data manipulation and processing.
One technique of image stabilization is performed on digital images as they are formed. This technique may use pixels outside the border of the visible frame as a buffer for undesired motion. Alternatively, the technique may use another relatively steady area or basis in successive frames. This technique is applicable to video cameras, shifting the electronic image from frame to frame of the video in a manner sufficient to counteract the motion. This technique does not depend on sensors, and stabilizes the images directly by reducing vibration and other distracting motion from the moving camera. In some techniques, the speed of the images may be slowed in order to add the stabilization process to the remainder of the digital process, adding more time per image. These techniques may use a global motion vector, calculated from the frame-to-frame motion differences, to determine the direction of stabilization.
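The sensor-free digital stabilization just described can be sketched concretely: estimate a global motion vector by matching a steady region between consecutive frames, then shift the new frame back by that vector, with the border pixels serving as the motion buffer. Frames are plain 2D lists here, and the small ±2-pixel search window is an assumption for illustration.

```python
# Sketch of digital image stabilization via a global motion vector estimated by
# exhaustive block matching (sum of absolute differences over the frame overlap).

def global_motion_vector(prev, curr, search=2):
    """Return (dy, dx) minimizing sum of absolute differences over the overlap."""
    h, w = len(prev), len(prev[0])
    best, best_vec = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            sad = sum(abs(prev[y][x] - curr[y + dy][x + dx])
                      for y in range(max(0, -dy), min(h, h - dy))
                      for x in range(max(0, -dx), min(w, w - dx)))
            if best is None or sad < best:
                best, best_vec = sad, (dy, dx)
    return best_vec

def stabilize(curr, vec, fill=0):
    """Shift the frame back by the motion vector so it aligns with the previous frame."""
    dy, dx = vec
    h, w = len(curr), len(curr[0])
    return [[curr[y + dy][x + dx] if 0 <= y + dy < h and 0 <= x + dx < w else fill
             for x in range(w)] for y in range(h)]
```

A production implementation would match several blocks (or use phase correlation) and smooth the vector over time, but the structure — estimate, then counter-shift — is the same.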
Optical stabilization of images uses a gravity- or electrically-driven mechanism to move or adjust an optical element or the imaging sensor so that it counteracts the ambient vibration. Another way to optically stabilize the displayed content is to provide gyroscopic correction or sensing of the platform housing the augmented reality glasses, e.g. the user. As noted above, the sensors available and in use on the augmented reality glasses or eyepiece include MEMS gyroscopic sensors. These sensors capture motion and movement in three dimensions in very small increments and can be used as feedback to correct the images sent from the camera in real time. It is clear that at least a large part of the undesired and undesirable movement is probably caused by movement of the user or of the camera itself. These larger movements may include gross movements of the user, e.g. walking or running, or riding in a vehicle. Smaller vibrations may arise within the augmented reality glasses, that is, from vibrations of components in the electrical and mechanical linkages that form the path from the camera (input) to the image formed in the waveguide (output). These gross movements may be more important to correct or account for than the independent and small movements in the linkages of components downstream from the projector, for example. In embodiments, gyroscopic stabilization may stabilize the image when the image experiences periodic motion. For such periodic motion, the gyroscope may determine the periodicity of the user's motion and send that information to the processor so that the placement of content in the user's view may be corrected. The gyroscope may utilize a rolling average of two, three, or more cycles of the periodic motion in determining the periodicity. Other sensors may also be used to stabilize the image or to place the image correctly in the user's field of view, such as an accelerometer, position sensor, distance sensor, rangefinder, biometric sensor, geodetic sensor, optical sensor, video sensor, camera, infrared sensor, photocell sensor, or RF sensor. When the sensors detect movement of the user's head or eyes, the sensors provide outputs to a processor, which may determine the direction, speed, amount, and rate of the user's head or eye movement. The processor may convert this information into a data structure suitable for further processing by the processor (which may be the same processor) controlling the optical assembly. The data structure may be one or more vectors. For example, the direction of a vector may define the orientation of the movement, and the length of the vector may define the rate of the movement. Using the processed sensor output, the display of content is adapted accordingly.
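The vector data structure described above — direction as orientation, length as rate — can be sketched minimally. The field names, the degrees-per-second units, and the one-frame latency figure are assumptions for illustration only.

```python
# Minimal sketch of the motion-vector data structure: head/eye motion samples
# reduce to a direction (unit vector) and a rate (vector length), which the
# display processor uses to re-place content against the motion.
import math

class MotionVector:
    def __init__(self, dx_deg_s, dy_deg_s):
        self.rate = math.hypot(dx_deg_s, dy_deg_s)  # angular speed, deg/s
        if self.rate > 0:
            self.direction = (dx_deg_s / self.rate, dy_deg_s / self.rate)
        else:
            self.direction = (0.0, 0.0)

def content_offset(vec, latency_s=0.02):
    """Degrees to shift displayed content to counter the motion over one frame."""
    return (-vec.direction[0] * vec.rate * latency_s,
            -vec.direction[1] * vec.rate * latency_s)
```

Splitting direction from rate, as the text suggests, lets the display controller treat orientation and magnitude independently, e.g. smoothing the rate while keeping the direction responsive.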
Motion sensing may thus be used either to sense motion and correct for it, as in optical stabilization, or to sense motion and then correct the images being taken and processed, as in image stabilization. An apparatus for sensing motion and correcting the images or the data is depicted in Fig. 34A. In this apparatus, one or more kinds of motion sensors may be used, including accelerometers, angular position sensors, or gyroscopes, such as MEMS gyroscopes. Data from the sensors is fed back to an appropriate sensor interface, such as an analog-to-digital converter (ADC), or another suitable interface, such as a digital signal processor (DSP). A microprocessor then processes this information, as described above, and sends image-stabilized frames to the display driver, and then to the see-through display or waveguide described above. In one embodiment, the display begins with the RGB display in the micro-projector of the augmented reality eyepiece.
In another embodiment, a video sensor or the augmented reality glasses, or another device with a video sensor, may be mounted on a vehicle. In this embodiment, the video stream may be transmitted via telecommunication or Internet capability to personnel in the vehicle. One application could be sightseeing or touring of an area. Another embodiment could be exploring or surveying an area, or even patrolling it. In these embodiments, gyroscopic stabilization of the image sensor itself would be helpful, rather than applying a gyroscopic correction to the images or to the digital data representing the images. An embodiment of this technique is depicted in Figure 34B. In this technique, a camera or image sensor 3407 is mounted on a vehicle 3401. One or more motion sensors 3406, such as gyroscopes, are mounted in the camera assembly 3405. A stabilizing platform 3404 receives information from the motion sensors and stabilizes the camera assembly 3405, so that jitter and wobble are minimized while the camera operates. This is true optical stabilization. Alternatively, the motion sensors or gyroscopes may be mounted on or within the stabilizing platform itself. This technique actually provides optical stabilization, stabilizing the camera or image sensor, in contrast to digital stabilization, in which the data taken by the camera is corrected afterwards by computer processing of the image.
In one technique, the key to optical stabilization is to apply the stabilization or correction before the image sensor converts the image into digital information. In one technique, feedback from sensors such as gyroscopes or angular velocity sensors is encoded and sent to an actuator that moves the image sensor, much as an autofocus mechanism adjusts the focus of a lens. The image sensor is moved in such a way as to maintain the projection of the image onto the image plane, which is a function of the focal length of the lens in use. Automatic ranging and focus information, perhaps from a rangefinder of the interactive head-mounted eyepiece, may be acquired through the lens itself. In another technique, angular velocity sensors, sometimes also called gyroscopic sensors, may be used to detect horizontal and vertical movements, respectively. The motion detected may then be fed back to electromagnets that move a floating lens of the camera. This optical stabilization technique, however, would have to be applied to each lens contemplated, making the result rather expensive.
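The dependence of the sensor-shift correction on focal length noted above can be made concrete with a short sketch. The formula (lateral shift = focal length times the tangent of the tilt angle) is the standard small-angle geometry for keeping the projection fixed on the image plane; the numeric values are illustrative only, not taken from the patent.

```python
import math

def sensor_shift_mm(tilt_deg, focal_mm):
    """Lateral displacement the actuator must apply to the image sensor to
    keep the image projection fixed after the camera tilts by tilt_deg.
    The required shift grows with lens focal length, which is why the
    correction needs the current focal length (e.g., from the autofocus
    or rangefinder information mentioned in the text)."""
    return focal_mm * math.tan(math.radians(tilt_deg))

# A 0.1-degree jitter on a 25 mm lens requires roughly 0.044 mm of
# sensor travel; the same jitter on a 50 mm lens requires twice as much.
shift = sensor_shift_mm(0.1, 25.0)
```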
Stabilization of liquid lenses is discussed in U.S. Patent Application Publication 2010/0295987, assigned to Varioptic of Lyons, France. In theory, control of a liquid lens is relatively simple, since there is only one variable to control: the level of voltage applied to the electrodes in the conducting and non-conducting liquids of the lens, e.g., using the lens housing and a cap as electrodes. Applying a voltage causes a change or tilt in the liquid-liquid interface via the electrowetting effect. This change or tilt adjusts the focus or output of the lens. In its most basic terms, a control scheme with feedback would then apply a voltage and determine the effect of the applied voltage on the result, i.e., on the focus or astigmatism of the image. The voltages may then be applied in various patterns, e.g., equal and opposite + and − voltages, two positive voltages of differing magnitude, two negative voltages of differing magnitude, and so forth. Such lenses are known as electrically variable optic lenses or electro-optic lenses.
Voltages may be applied to the electrodes in patterns for a short period of time, and a check made on the focus or astigmatism. The check may be performed, e.g., by an image sensor. In addition, sensors on the camera, or in this case on the lens, may detect movement of the camera or lens. The motion sensors may include accelerometers, gyroscopes, angular velocity sensors, or piezoelectric sensors mounted on the liquid lens or on a portion of the optic train in very close proximity to the liquid lens. In one embodiment, a table, such as a calibration table, is then constructed of the voltages applied and the voltages needed for a given degree of correction or a given level of movement. More sophistication may also be added, for example by using segmented electrodes in different portions of the liquid, so that four voltages may be applied rather than two. Of course, if four electrodes are used, four voltages may be applied, in many more patterns than with only two electrodes. These patterns may include equal and opposite positive and negative voltages to opposite segments, and so forth. An example is depicted in Figure 34C. Four electrodes 3409 are mounted within a liquid lens housing (not shown). Two electrodes are mounted in or near the non-conducting liquid, and the other two are mounted in or near the conducting liquid. Each electrode is independent in terms of the voltage that may be applied to it.
A look-up or calibration table may be constructed and placed in the memory of the augmented reality glasses. In use, the accelerometer or other motion sensor will sense the movement of the glasses, i.e., of the camera or the lens on the glasses. A motion sensor such as an accelerometer will in particular sense small vibration-type movements that interfere with smooth delivery of images to the waveguide. In one embodiment, the image stabilization techniques described here may be applied to the electrically-controllable liquid lens so that the image from the projector is corrected immediately. This stabilizes the output of the projector, at least partially correcting for the vibration and movement of the augmented reality eyepiece, as well as for at least some movement of the user. There may also be a manual control for adjusting the gain or other parameters of the corrections. Note that, in addition to the focus adjustment provided as part of the image sensor control and discussed as part of the adjustable-focus projector, this technique may also be used to correct for the near-sightedness or far-sightedness of the individual user.
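The calibration look-up described above can be sketched as a small interpolation routine. The table entries here are purely illustrative assumptions (in practice they would be measured per device, as the text describes, and stored in the glasses' memory); the mapping from sensed motion to electrode voltage is hypothetical.

```python
from bisect import bisect_left

# Hypothetical calibration table: sensed angular rate (deg/s) mapped to a
# corrective electrode voltage (V) for the electrowetting liquid lens.
CAL_TABLE = [(0.0, 0.0), (0.5, 4.0), (1.0, 7.5), (2.0, 14.0), (4.0, 26.0)]

def correction_voltage(rate):
    """Linearly interpolate the calibration table; clamp at both ends."""
    xs = [x for x, _ in CAL_TABLE]
    if rate <= xs[0]:
        return CAL_TABLE[0][1]
    if rate >= xs[-1]:
        return CAL_TABLE[-1][1]
    i = bisect_left(xs, rate)
    (x0, v0), (x1, v1) = CAL_TABLE[i - 1], CAL_TABLE[i]
    return v0 + (v1 - v0) * (rate - x0) / (x1 - x0)
```

With segmented electrodes, the same table-driven scheme would simply return a tuple of four voltages per entry instead of one.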
Another variable-focus element uses tunable liquid crystal cells to focus an image. These are disclosed, for example, in U.S. Patent Application Publications 2009/0213321, 2009/0316097, and 2010/0007807, which are hereby incorporated by reference in their entirety and relied upon. In this method, a liquid crystal material is contained within a transparent cell, preferably with a matching index of refraction. The cell includes transparent electrodes, such as electrodes made from indium tin oxide (ITO). Using one spiral electrode, and a second spiral electrode or a planar electrode, a spatially non-uniform electric field is applied. Electrodes of other shapes may be used. The shape of the field determines the rotation of the molecules in the liquid crystal cell, achieving a change in refractive index and thus a change in the focus of the lens. The liquid crystals can thus be electromagnetically manipulated to change their index of refraction, making the tunable liquid crystal cell act as a lens.
In a first embodiment, a tunable liquid crystal cell 3420 is depicted in Figure 34D. The cell includes an inner layer of liquid crystal 3421 and thin layers 3423 of an orienting material such as polyimide. This material helps to orient the liquid crystals in a preferred direction. Transparent electrodes 3425 are located on each side of the orienting material. An electrode may be planar, or may be spiral shaped, as shown on the right side of Figure 34D. Transparent glass substrates 3427 contain the materials within the cell. The electrodes are formed so that they lend shape to the electric field. As noted, a spiral-shaped electrode on one or both sides is used in one embodiment, such that the two sides are not symmetric. A second embodiment is depicted in Figure 34E. The tunable liquid crystal cell 3430 includes a central liquid crystal material 3431, transparent glass substrate walls 3433, and transparent electrodes. The bottom electrode 3435 is planar, while the top electrode 3437 is in the shape of a spiral. The transparent electrodes may be made of indium tin oxide (ITO).
Additional electrodes may be used for quick reversion of the liquid crystal to a non-shaped or natural state. A small control voltage is thus used to dynamically change the refractive index of the material the light passes through. The voltage generates a spatially non-uniform electric field of the desired shape, allowing the liquid crystal to act as a lens.
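As a rough illustration of how an index change translates into focusing power, a gradient-index liquid crystal lens with an approximately parabolic index profile has a focal length of about r²/(2·t·Δn). This formula and all values below are outside the patent text and are given only as an assumed, order-of-magnitude sketch of the mechanism.

```python
def lc_lens_focal_mm(aperture_radius_mm, lc_thickness_mm, delta_n):
    """Approximate focal length of a gradient-index liquid crystal lens:
    f ~ r^2 / (2 * t * delta_n), where delta_n is the refractive-index
    difference between the cell center and edge induced by the spatially
    non-uniform field. Illustrative values only."""
    return aperture_radius_mm ** 2 / (2.0 * lc_thickness_mm * delta_n)

# A 2 mm-radius cell, 50 um thick, with an induced delta_n of 0.2,
# focuses at roughly 200 mm; halving delta_n doubles the focal length.
f = lc_lens_focal_mm(2.0, 0.05, 0.2)
```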
In one embodiment, the camera includes a black silicon, short-wave infrared (SWIR) CMOS sensor, described elsewhere in this patent. In another embodiment, the camera is a 5-megapixel (MP), optically-stabilized video sensor. In one embodiment, the controls include a 3 GHz microprocessor or microcontroller, and may also include a 633 MHz digital signal processor with a 30M polygons/second graphics accelerator for real-time image processing of images from the camera or video sensor. In one embodiment, the augmented reality glasses may include wireless Internet, radio, or telecommunications capability for broadband, a personal area network (PAN), a local area network (LAN), a wide area network, a WLAN conforming to IEEE 802.11, or reach-back communications. The equipment furnished in one embodiment includes a Bluetooth capability conforming to IEEE 802.15. In one embodiment, the augmented reality glasses include an encryption system for secure communications, such as a 256-bit Advanced Encryption Standard (AES) encryption system, or another suitable encryption program.
In one embodiment, the wireless telecommunications may include capability for a 3G or 4G network, and may also include wireless Internet capability. For an extended life, the augmented reality eyepiece or glasses may also include at least one lithium-ion battery, and, as discussed above, a recharging capability. The recharging plug may comprise an AC/DC power converter and may be capable of using multiple input voltages, such as 120 or 240 V. The controls for adjusting the focus of the adjustable-focus lenses in one embodiment comprise a 2D or 3D wireless air mouse or another non-contact control responsive to gestures or movements of the user. A 2D mouse is available from Logitech of Fremont, California, USA. A 3D mouse is described herein, or others may be used, such as the Cideko AVK05 available from Cideko of Taiwan.
In one embodiment, the eyepiece may comprise electronics suitable for controlling the optics and the associated systems, including a central processing unit, non-volatile memory, digital signal processors, 3-D graphics accelerators, and the like. The eyepiece may provide additional electronic elements or features, including inertial navigation systems, cameras, microphones, audio output, power, communication systems, sensors, stopwatch or chronometer functions, thermometers, vibratory temple motors, motion sensors, microphones to enable audio control of the system, UV sensors to enable contrast and dimming with photochromic materials, and the like.
In one embodiment, the central processing unit (CPU) of the eyepiece may be an OMAP 4, with dual 1 GHz processor cores. The CPU may include a 633 MHz DSP, giving the CPU a capability of 30 million polygons/second.
The system may also be provided with dual micro-SD (secure digital) slots for provisioning additional removable non-volatile memory.
An on-board camera may provide 1.3 MP color and record up to 60 minutes of video footage. The recorded video may be transferred wirelessly, or a mini-USB transfer device may be used to offload the footage.
A communications system-on-a-chip (SOC) may be capable of operating with wide local area networks (WLAN), Bluetooth version 3.0, a GPS receiver, an FM radio, and the like.
The eyepiece may operate on a 3.6 VDC lithium-ion rechargeable battery for long battery life and ease of use. Additional power may be provided by solar cells external to the frame of the system. These solar cells may supply power and may also be capable of recharging the lithium-ion battery.
The total power consumption of the eyepiece may be approximately 400 mW, but varies depending on the features and applications used. For example, processor-intensive applications with significant video graphics demand more power and will be closer to 400 mW. Simpler, less video-intensive applications will use less power. The operating time on a charge may also vary with the applications and features used.
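Given the 3.6 V battery and ~400 mW draw stated in the text, run time follows from the battery's stored energy. The cell capacity below is an assumed figure for illustration only; the patent does not state one.

```python
def runtime_hours(battery_v, capacity_mah, load_mw):
    """Estimated run time from the battery's stored energy and average draw.
    The 3.6 V and 400 mW figures come from the text; the capacity is an
    assumed value, not a specification of the eyepiece."""
    energy_mwh = battery_v * capacity_mah  # stored energy in mWh
    return energy_mwh / load_mw

# An assumed 500 mAh cell at the stated ~400 mW draw gives about 4.5 hours;
# a lighter 200 mW workload would roughly double that.
hours = runtime_hours(3.6, 500, 400)
```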
The micro-projector illumination engine, also referred to herein as the projector, may include multiple light-emitting diodes (LEDs). In order to provide lifelike color, Osram red, Cree green, and Cree blue LEDs are used. These are die-based LEDs. The RGB engine may provide an adjustable color output, allowing a user to optimize viewing for various programs and applications.
In embodiments, illumination may be added to the glasses, or the illumination may be controlled, through various means. For example, LED lights or other lights may be embedded in the frame of the eyepiece, such as in the nose bridge, around the composite lens, or at the temples.
The intensity of the illumination and/or the color of the illumination may be modulated. Modulation may be accomplished through the various control technologies described herein, through various applications, and through filtering and magnification.
By way of example, the illumination may be modulated through the various control technologies described herein, such as adjustment of a control knob, a gesture, eye movement, or a voice command. If a user desires to increase the intensity of the illumination, the user may adjust a control knob on the glasses, adjust a control knob in the user interface displayed on the lens, or do so by other means. The user may use eye movements to control the knob displayed on the lens, or may control the knob by other means. The user may adjust the illumination through a movement of the hand or other body movement, such that the intensity or color of the illumination changes based on the movement made by the user. Also, the user may adjust the illumination through a voice command, such as by speaking a phrase requesting increased or decreased illumination, or requesting that other colors be displayed. Additionally, illumination modulation may be achieved through any control technology described herein, or through other means.
Further, the illumination may be modulated per the particular application being executed. As an example, an application may automatically adjust the intensity or the color of the illumination based on the optimal settings for that application. If the current level of illumination is not optimal for the application being executed, a message or command may be sent to provide for illumination adjustment.
In embodiments, illumination modulation may be accomplished through filtering and/or through magnification. For example, filtering techniques may be employed that allow the intensity and/or color of the light to be changed such that the optimal or desired illumination is achieved. Also, in embodiments, the intensity of the illumination may be modulated by applying greater or lesser magnification to reach the desired illumination intensity.
The projector may be connected to a display to output the video and other display elements to the user. The display used may be an SVGA 800×600 dots/inch SYNDIANT liquid crystal on silicon (LCoS) display.
The target MPE dimensions for the system may be 24 mm × 12 mm × 6 mm.
The focus may be adjustable, allowing the user to refine the projector output to suit their needs.
The optics system may be contained within a housing fabricated of 6061-T6 aluminum and glass-filled ABS/PC.
In one embodiment, the weight of the system is estimated to be 3.75 ounces, or 95 grams.
In one embodiment, the eyepiece and associated electronics provide night vision capability. This night vision capability may be enabled by a black silicon SWIR sensor. Black silicon is a complementary metal-oxide silicon (CMOS) processing technique that enhances the photoresponse of silicon over 100 times. The spectral range is expanded deep into the short-wave infrared (SWIR) wavelength range. In this technique, a 300 nm-deep absorbing and antireflective layer is added to the glasses. This layer offers improved responsivity, as shown in Figure 11, where the responsivity of black silicon is much greater than that of silicon over the visible and NIR ranges and extends well into the SWIR range. This technology is an improvement over current technology, which suffers from extremely high cost, performance issues, and high-volume manufacturability problems. Incorporating this technology into night vision optics brings the economic advantages of CMOS technology into the design.
Unlike current night vision goggles (NVGs), which amplify starlight or other ambient light in the visible spectrum, SWIR sensors pick up individual photons and convert light in the SWIR spectrum into electrical signals, similar to digital photography. The photons may be generated by the natural recombination of oxygen and hydrogen atoms in the atmosphere at night (also known as "nightglow"). Short-wave infrared devices see objects at night by detecting invisible, short-wave infrared radiation in reflected starlight, city lights, or moonlight. They also work during the day, or through fog, haze, or smoke that would overwhelm current NVG image intensifiers and infrared sensors with heat or brightness. Because short-wave infrared devices pick up invisible radiation at the edge of the visible spectrum, SWIR images appear like visible-light images, with the same shadows, contrast, and facial detail, only in black and white, sharply enhancing identification, so that a person looks like a person, rather than the blob typically seen with thermal imagers. One important SWIR capability is providing a view of targeting lasers on the battlefield. Targeting lasers (1.064 µm) are invisible with current night vision goggles. With SWIR electro-optics, a soldier will see every targeting laser in use, including those used by enemy forces. Unlike thermal imagers, which cannot see through windows on vehicles or buildings, visible/near-infrared/short-wave infrared sensors see through them, day or night, giving the user an important tactical advantage.
Certain advantages include using active illumination only when needed. In some instances there may be sufficient natural illumination at night, such as during a full moon. When such is the case, artificial night vision using active illumination may not be necessary. With black silicon CMOS-based SWIR sensors, active illumination may not be needed during these conditions, and is not provided, thereby improving battery life.
In addition, a black silicon image sensor may have over eight times the signal-to-noise ratio found in costly InGaAs sensors under night-sky conditions. This technology also provides better resolution, offering much higher resolution than is available with current night vision technology. Typically, the long-wavelength images produced by CMOS-based SWIR have been difficult to interpret, having good heat detection but poor resolution. This problem is solved with a black image silicon SWIR sensor, which relies on much shorter wavelengths. For these reasons, SWIR is highly desirable for battlefield night vision goggles. Figure 12 illustrates the effectiveness of black silicon night vision technology, providing images before and after seeing through a) dust, b) fog, and c) smoke. The images in Figure 12 demonstrate the performance of the new VIS/NIR/SWIR black silicon sensor. In embodiments, the image sensor may be able to distinguish changes in the natural environment, such as disturbed vegetation, disturbed ground, and the like. For instance, an enemy combatant may have recently placed an explosive device in the ground, so that the ground over the explosive would be "disturbed ground," and the image sensor (along with processing facilities either internal or external to the eyepiece) may be able to tell the difference between the recently disturbed ground and the surrounding ground. In this way, a soldier may be able to detect the placement of buried explosive devices (e.g., an improvised explosive device (IED)) at a distance.
Previous night vision systems suffered from "blooming" from bright light sources, such as streetlights. These "blooms" were especially severe in image intensifying technology and are also associated with a loss of resolution. In some cases, cooling systems are necessary in image intensifying technology systems, increasing weight and shortening battery life. Figure 17 shows the difference in image quality between A) a flexible platform of uncooled CMOS image sensors capable of VIS/NIR/SWIR imaging and B) an image intensified night vision system.
Figure 13 depicts the difference in structure between current or incumbent vision enhancement technology 1300 and uncooled CMOS image sensors 1307. The incumbent platform (Figure 13A) limits deployment because of cost, weight, power consumption, spectral range, and reliability issues. Incumbent systems are typically composed of a front lens 1301, photocathode 1302, micro-channel plate 1303, high-voltage power supply 1304, phosphorous screen 1305, and eyepiece 1306. This contrasts with a flexible platform (Figure 13B) of uncooled CMOS image sensors 1307 capable of VIS/NIR/SWIR imaging at a fraction of the cost, power consumption, and weight. These much simpler sensors include a front lens 1308 and an image sensor 1309 with a digital image output.
These advantages derive from the CMOS-compatible processing technique, which enhances the photoresponse of silicon over 100 times and extends the spectral range deep into the short-wave infrared region. The difference in responsivity is illustrated in Figure 13C. While typical night vision goggles are limited to the UV, visible, and near-infrared (NIR) ranges, to about 1100 nm (1.1 µm), the newer CMOS image sensor ranges also include the short-wave infrared (SWIR) spectrum, out to as much as 2000 nm (2 µm).
The black silicon core technology may offer a significant improvement over current night vision goggles. Femtosecond laser doping may enhance the light detection properties of silicon across a broad spectrum. Additionally, the optical response may be improved by a factor of 100 to 10,000. Compared to current night vision systems, the black silicon technology is a fast, scalable, and CMOS-compatible technology at a remarkably low cost. The black silicon technology may also provide a low operating bias, typically 3.3 V. In addition, uncooled performance may be possible up to 50 °C. The cooling requirement of current technology increases both weight and power consumption, and also creates discomfort for users. As noted above, the black silicon core technology offers a high-resolution replacement for current image intensifier technology. The black silicon core technology may provide high-speed electronic shuttering at speeds of up to 1000 frames/second with minimal cross-talk. In certain embodiments of the night vision eyepiece, an OLED display may be preferred over other optical displays, such as an LCoS display.
A VIS/NIR/SWIR black silicon sensor may provide improved situational awareness (SAAS) monitoring and real-time image enhancement.
In some embodiments, the VIS/NIR/SWIR black silicon sensor may be incorporated into a dedicated night-vision form factor, such as night vision goggles or a night vision helmet. The night vision goggles may include features adapted to the military market, such as ruggedization and alternative forms of power, while other form factors may be adapted to the consumer or toy market. In one example, the night vision goggles may have an extended range, such as 500–1200 nm, and may also serve as a camera.
In some embodiments, VIS/NIR/SWIR black silicon sensors and other external sensors may be incorporated into cameras mounted on transport or combat vehicles, so that video can be fed back in real time, superimposed on the forward view without obstruction, to the driver of the vehicle or other occupants. The driver can better see where he or she is going, a gunner can better see sudden threats or targets, and a navigator can better maintain situational awareness (SAAS) while also scanning for threats. The feed may also be sent as needed to off-site locations, such as memory/storage locations or higher-echelon headquarters, for later use in targeting, navigation, surveillance, data mining, and the like.
A further advantage of the eyepiece may be robust connectivity. This connectivity enables download and transmission using Bluetooth, Wi-Fi/Internet, cellular, satellite, 3G, FM/AM, TV, and UWB transceivers for rapidly sending and receiving large amounts of data. For example, a UWB transceiver may be used to create a very high data rate, low probability of intercept/low probability of detection (LPI/LPD) wireless personal area network (WPAN) to connect weapon sights, weapon-mounted mice/controllers, E/O sensors, medical sensors, audio/visual displays, and the like. In other embodiments, other communication protocols may be used to create the WPAN. For example, the WPAN transceiver may be a COTS-compliant modular front end that makes the power management of combat radios highly responsive and avoids compromising the robustness of the radio. By integrating an ultra-wideband (UWB) transceiver, baseband/MAC, and encryption chip into one module, a physically small, dynamically configurable transceiver is obtained that solves a range of operational requirements. The WPAN transceiver creates a low-power, encrypted, wireless personal area network (WPAN) among the devices worn by a soldier. The WPAN transceiver can be attached to, or embedded in, virtually any battlefield military device with a network interface (handheld computers, combat displays, etc.). The system can support many users and AES encryption, offers RF robustness against jamming and interference, and, importantly for combat, provides a low probability of intercept and detection (LPI/LPD). The WPAN transceiver eliminates the bulk, weight, and "snagging" of data cables on the soldier. Interfaces include USB 1.1, USB 2.0 OTG, Ethernet 10/100 Base-T, and RS-232 9-pin D-Sub. For a variable range of up to 2 m, the power output may be −10 to −20 dBm. The data capacity may be 768 Mbps and greater. The bandwidth may be 1.7 GHz. Encryption may be 128-, 192-, or 256-bit AES. The WPAN transceiver may include optimized message authentication code (MAC) generation. The WPAN transceiver may conform to MIL-STD-461F. The WPAN transceiver may take the form of a connector dust cap and may be attachable to any battlefield military device. The WPAN transceiver allows simultaneous video, voice, still photos, text, and chat; eliminates the need for data cables between electronic devices; and allows multiple devices to be operated hands-free and without distraction. It is characterized by an adjustable connectivity range, Ethernet and USB 2.0 interfaces, an adjustable frequency of 3.1 to 10.6 GHz, 200 mW peak power consumption, and nominal standby power.
For example, the WPAN transceiver allows a WPAN to be created between the eyepiece 100 in the form of GSE stereo heads-up display glasses, a combat computer, a remote computing set controller, and a biometric information enrollment device as seen in Figure 58. In another example, the WPAN transceiver allows a WPAN to be created between combat eyewear using a flip-up/flip-down heads-up display, the HUD CPU (which is external), a weapon foregrip controller, and a forearm computer similar to that shown in Figure 58.
The eyepiece may provide its own cellular connectivity, such as through a personal wireless connection with a cellular system. The personal wireless connection may be available only to the wearer of the eyepiece, or it may be available to a plurality of proximate users, such as in a Wi-Fi hotspot (e.g., MiFi), where the eyepiece provides a local hotspot for others to utilize. These proximate users may be other wearers of eyepieces, or users of some other wireless computing device, such as a mobile communications facility (e.g., a mobile phone). Through this personal wireless connection, the wearer may not need other cellular or Internet wireless connections to connect to wireless services. For instance, without a personal wireless connection integrated into the eyepiece, the wearer might have to locate a WiFi connection point or tether to their mobile communications facility in order to establish a wireless connection. In embodiments, the eyepiece may replace the need for a separate mobile communications device (such as a mobile phone or mobile computer) by integrating these functions and user interfaces into the eyepiece. For instance, the eyepiece may have an integrated WiFi connection or hotspot, a real or virtual keyboard interface, a USB hub, speakers (e.g., to stream music to) or speaker input connections, an integrated camera, an external camera, and the like. In embodiments, an external device connected to the eyepiece may provide a single unit with a personal network connection (e.g., WiFi, cellular connection), a keyboard, a control pad (e.g., a touch pad), and the like.
Communications from the eyepiece may include communications links for special purposes. For example, an ultra-wideband communications link may be used to transmit and/or receive large amounts of data in bursts over a short amount of time. In another example, a near-field communications (NFC) link with a very limited transmission range may be used to transfer information to individuals only when they are in close proximity, such as for tactical reasons, for local directions, for warnings, and the like. For instance, a soldier may securely transmit/hold information that is delivered only to people in close proximity who need to know or need to use the information. In another example, a wireless personal area network (PAN) may be used to connect, for example, weapon-sight-mounted mice/controllers, electro-optic sensors, medical sensors, audio-visual displays, and the like.
The eyepiece may include MEMS-based inertial navigation systems, such as a GPS processor, an accelerometer (e.g., for enabling head control of the system and other functions), a gyroscope, an altimeter, an inclinometer, a speedometer/odometer, a laser rangefinder, and a magnetometer, which also make image stabilization possible.
The eyepiece may include integrated headphones, such as earbuds 120, that provide audio output to the user or wearer.
In one embodiment, a forward-facing camera integrated with the eyepiece (see Figure 21) enables basic augmented reality. In augmented reality, a viewer can image what is being viewed and then layer an augmented, edited, tagged, or analyzed version on top of the basic view. In the alternative, associated data may be displayed with or over the basic image. If two cameras are provided and are mounted at the correct interpupillary distance for the user, stereo video imagery may be created. This capability may be useful for persons requiring vision assistance. Many people suffer from deficiencies in their vision, such as near-sightedness, far-sightedness, and so forth. A camera and a very close virtual screen, as described herein, provide "video" for such persons, video that is adjustable in focal point (nearer or farther) and fully controllable by the person via voice or other commands. This capability may also be useful for persons suffering from eye diseases, such as cataracts, retinitis pigmentosa, and the like. So long as some organic vision capability remains, an augmented reality eyepiece can help such a person see more clearly. Embodiments of the eyepiece may feature one or more of magnification, increased brightness, and the ability to map content to areas of the eye that are still healthy. Embodiments of the eyepiece may be used as bifocals or as a magnifying glass. The wearer may be able to increase zoom in the field of view or increase zoom within a partial field of view. In one embodiment, an associated camera may take an image of an object and then present a zoomed picture to the user. A user interface may allow the wearer to point at the area he wants zoomed, such as with the control techniques described herein, so that the image processing can stay on task, as opposed to just zooming in on everything in the camera's field of view.
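The region-of-interest zoom described above (magnifying only the pointed-at area, not the whole frame) can be sketched as a simple crop step. The function name and parameters are assumptions for illustration; resampling the crop up to the display resolution is left to the rest of the display pipeline.

```python
def zoom_region(image, cx, cy, zoom):
    """Crop a region of interest around the pointed-at pixel (cx, cy) so
    that only that area is magnified. `image` is a list of rows (any 2-D
    indexable); `zoom` > 1 shrinks the crop window proportionally. The
    window is clamped so it never leaves the frame."""
    h, w = len(image), len(image[0])
    half_h = max(1, int(h / (2 * zoom)))
    half_w = max(1, int(w / (2 * zoom)))
    top = min(max(cy - half_h, 0), h - 2 * half_h)
    left = min(max(cx - half_w, 0), w - 2 * half_w)
    return [row[left:left + 2 * half_w] for row in image[top:top + 2 * half_h]]

# 2x zoom centered where the wearer pointed in a 10x10 test frame:
frame = [[r * 10 + c for c in range(10)] for r in range(10)]
crop = zoom_region(frame, cx=5, cy=5, zoom=2)
```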
In further embodiments, a rear-facing camera (not shown) may also be incorporated into the eyepiece. In this embodiment, the rear-facing camera enables eye control of the eyepiece, with the user making application or feature selections by directing his or her eyes at a particular item displayed on the eyepiece.
A further embodiment of a device for capturing biometric data about individuals may incorporate a micro-Cassegrain telescoping folded optic camera into the device. The micro-Cassegrain telescoping folded optic camera may be mounted on a handheld device, such as the bio-print device or the bio-phone, and may also be mounted on glasses used as part of a bio-kit to collect biometric data.
A Cassegrain reflector is a combination of a primary concave mirror and a secondary convex mirror. These reflectors are often used in optical telescopes and radio antennas because they deliver good light- (or sound-) gathering capability in a shorter, smaller package.
In a symmetrical Cassegrain, both mirrors are aligned about the optical axis, and the primary mirror usually contains a hole in the center, allowing light to reach the eyepiece or a camera chip or light-detecting device, such as a CCD chip. An alternate design, often used in radio telescopes, places the final focus in front of the primary reflector. A further alternate design may tilt the mirrors to avoid obstructing the primary or secondary mirror, and may eliminate the need for a hole in the primary or secondary mirror. The micro-Cassegrain telescoping folded optic camera may use any of the above variations, with the final selection determined by the desired size of the optic device.
The classic Cassegrain configuration 3500 uses a parabolic reflector as the primary mirror and a hyperbolic mirror as the secondary mirror. Further embodiments of the micro-Cassegrain telescoping folded optic camera may use a hyperbolic primary mirror and/or a spherical or elliptical secondary mirror. In operation, the classic Cassegrain, with a parabolic primary mirror and a hyperbolic secondary mirror, reflects the light back down through a hole in the primary, as shown in Figure 35. Folding the optical path makes the design more compact, and in a "micro" size suitable for use with the bio-print sensor and bio-print kit described herein. In a folded optic system, the beam is bent so that the optical path is much longer than the physical length of the system. One common example of folded optics is prismatic binoculars. In a camera lens, the secondary mirror may be mounted on an optically flat, optically clear glass plate that closes the lens tube. This support eliminates "star-shaped" diffraction effects that would be caused by a straight-vaned support spider. This allows for a sealed, closed tube and protects the primary mirror, albeit at some loss of light-collecting power.
The Cassegrain design also makes use of the special properties of parabolic and hyperbolic reflectors. A concave parabolic reflector reflects all incoming light rays parallel to its axis of symmetry to a single focal point. A convex hyperbolic reflector has two foci, and reflects all light rays directed at one focal point toward the other focal point. The mirrors in this type of lens are designed and positioned to share one focal point, placing the second focal point of the hyperbolic mirror at the same point as where the image is observed, usually just outside the eyepiece. The parabolic mirror reflects parallel light rays entering the lens to its focus, which is coincident with the focus of the hyperbolic mirror. The hyperbolic mirror then reflects those light rays to its other focus, where the camera records the image.
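The focal geometry just described can be checked numerically with the standard two-mirror combination formula, F = f1·f2 / (f1 + f2 − d), using the convention that the concave primary has f1 > 0, the convex secondary has f2 < 0, and d is the mirror separation. The example numbers below are illustrative assumptions, not values from this specification.

```python
def cassegrain_efl(f1, f2, d):
    """Effective focal length of a two-mirror system (thin-element formula)."""
    return f1 * f2 / (f1 + f2 - d)

def back_focal_distance(f1, f2, d):
    """Distance from the secondary mirror back to the final focus."""
    return cassegrain_efl(f1, f2, d) * (f1 - d) / f1

# Illustrative numbers (mm): 500 mm concave primary, -150 mm convex
# secondary, separated by 400 mm.
f1, f2, d = 500.0, -150.0, 400.0
efl = cassegrain_efl(f1, f2, d)            # 1500 mm effective focal length
tube = d + back_focal_distance(f1, f2, d)  # ~700 mm physical length
```

With these assumed numbers, the effective focal length (1500 mm) is more than twice the physical tube length (about 700 mm), which is exactly the compactness benefit of the folded optical path that the text describes.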
Figure 36 shows the configuration of the micro-Cassegrain telescoping folded optic camera. The camera may be mounted on augmented reality glasses, a bio-phone, or another biometric collection device. The assembly 3600 has multiple telescoping segments that allow the camera to extend with Cassegrain optics, providing a longer optical path. Threads 3602 allow the camera to be mounted on a device, such as augmented reality glasses or another biometric collection device. While the embodiment depicted in Figure 36 uses threads, other mounting schemes, such as bayonet mounts, knobs, or press-fits, may also be used. A first telescoping section 3604 also acts as an external housing when the lens is in the fully retracted position. The camera may also incorporate a motor to drive the extension and retraction of the camera. A second telescoping section 3606 may also be included. Other embodiments may incorporate varying numbers of telescoping sections, depending on the length of the optical path needed for the selected task or the data to be collected. A third telescoping section 3608 includes the lens and a reflecting mirror. The reflecting mirror may be a primary reflector if the camera is designed to follow the classic Cassegrain design. The secondary mirror may be contained in the first telescoping section 3604.
A further embodiment may use micro-mirrors to form the camera while still providing a longer optical path through the use of folded optics, using the same principles as the Cassegrain design.
Lens 3610 provides optics for use with the folded optics of the Cassegrain design. The lens 3610 may be selected from a variety of types, and may vary depending on the application. The threads 3602 permit a variety of cameras to be interchanged depending on the needs of the user.
Eye control of feature and option selection may be controlled and activated by object recognition software loaded on the system processor. Object recognition software may enable augmented reality, combine the recognition output with querying a database, and combine the recognition output with a computational tool to determine dependencies/likelihoods.
Three-dimensional viewing is also possible in an additional embodiment that incorporates a 3D projector. Two stacked picoprojectors (not shown) may be used to create the three-dimensional image output.
With reference to Figure 10, a plurality of digital CMOS sensors, with redundant microprocessors and DSPs for each sensor array and projector, detect visible light, near-infrared light, and short-wave infrared light to enable passive day and night operations, such as real-time image enhancement 1002, real-time keystone correction 1004, and real-time virtual perspective correction 1008. The eyepiece may utilize digital CMOS image sensors and directional microphones as described herein (such as a microphone array), for instance for visual imaging to monitor the visible scene (e.g. biometric recognition, gesture control, coordinated imaging with 2D/3D projected maps), IR/UV imaging for scene enhancement (e.g. seeing through haze, smoke, darkness), audio direction sensing (e.g. the direction of gunfire or an explosion, voice detection), and the like. In embodiments, each of these sensor inputs may be fed to a digital signal processor (DSP) for processing, such as a DSP internal to the eyepiece or one interfaced with an external processing facility. The outputs of the DSP processing of each of the sensor input streams may then be algorithmically combined in a manner that generates useful informational data. For instance, the system may be useful for the combination of real-time face recognition, real-time voice detection, analysis through links to databases (especially with distortion correction), and GPS location of soldiers, personnel, and the like, such as when monitoring a remote region of interest, e.g. a known path or trail, or a high-security area. In an embodiment, an audio direction sensor input to the DSP may be processed to generate one or more visual, audible, or vibrational cues to the user of the eyepiece indicating the direction of a sound. For instance, if hearing protection is not being used through a loud explosion or the sound of gunfire, to protect the hearing of the soldier, or if the explosion was so loud that the soldier cannot tell where it came from and their ears may be ringing so loudly that they now cannot hear anything, an audible or vibrational cue may be used to indicate to the operative the direction of the original threat.
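Audio direction sensing with a microphone array is commonly done by measuring the time difference of arrival (TDOA) of a sound at two microphones. The specification does not give an algorithm; the sketch below, under that common assumption, converts a measured delay into a bearing relative to the perpendicular of the two-microphone baseline.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def bearing_from_tdoa(delta_t, mic_spacing):
    """Estimate the angle of an incoming sound (radians) from the
    arrival-time difference delta_t (seconds) between two microphones
    separated by mic_spacing (meters)."""
    ratio = SPEED_OF_SOUND * delta_t / mic_spacing
    ratio = max(-1.0, min(1.0, ratio))  # clamp measurement noise into asin's domain
    return math.asin(ratio)
```

For example, with microphones 15 cm apart, a sound arriving 30 degrees off-axis produces a delay of roughly 219 microseconds, which the function maps back to the 30-degree bearing that would drive the visual or vibrational cue.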
The augmented reality eyepiece or glasses may be powered by any stored-energy system, such as battery power, solar power, line power, and the like. A solar energy collector may be placed on the frame, on a belt clip, and the like. Battery charging may occur using a wall charger or car charger, on a belt clip, in a glasses case, and so forth. In one embodiment, the eyepiece may be rechargeable and be equipped with a mini-USB connector for recharging. In another embodiment, the eyepiece may be equipped for remote inductive recharging by one or more remote inductive power conversion technologies, such as those provided by Powercast of Ligonier, Pennsylvania, USA, and Fulton Int'l. Inc. of Ada, Michigan, USA, which also owns another provider, Splashpower, Inc. of Cambridge, UK.
The augmented reality eyepiece also includes a camera and any interface necessary to connect the camera to the circuitry. The output of the camera may be stored in memory and may also be displayed on the display available to the wearer of the glasses. A display driver may also be used to control the display. The augmented reality device also includes a power supply, such as a battery, as shown, power management circuits, and circuits for recharging the power supply. As noted elsewhere, recharging may take place via a hard-wired connection (e.g. a mini-USB connector), or by inductive means, a solar panel input, and so forth.
The control system of eyepiece or glasses may include for the saving when the power supply instruction of such as battery etc goes out low battery The control algolithm of power supply.The saving-algorithm may include the power supply for closing the application to energy-sensitive, and such as illumination, camera or requirement are high Sensor of energy level, such as any sensor for requiring heater etc..It may include slowing down for sensing that other, which save step, The power supply of device or camera, for example, slow down sampling or frame per second, when electric power is low enter it is lower sampling or frame per second, in lower level When closure sensor or camera.To have at least three kinds of operation modes: normal mode according to available power;Battery saving mode;With And urgent or close pattern.
Applications of the present disclosure may be controlled through movements and direct actions of the wearer, such as movement of his or her hand, finger, feet, head, eyes, and the like, enabled through facilities of the eyepiece (e.g. accelerometers, gyros, cameras, optical sensors, GPS sensors, and the like) and/or through facilities worn by or mounted on the wearer (e.g. body-mounted sensor control facilities). In this way, the wearer may directly control the eyepiece through movements and/or actions of their body without the use of a traditional hand-held remote control. For instance, the wearer may have a sense device, such as a position sense device, mounted on one or both hands (such as on at least one finger, on the palm, on the back of the hand, and the like), where the position sense device provides position data of the hand and provides wireless communication of the position data as command information to the eyepiece. In embodiments, the sense devices of the present disclosure may include a gyroscopic device (e.g. an electronic gyroscope, MEMS gyroscope, mechanical gyroscope, quantum gyroscope, ring laser gyroscope, or fiber optic gyroscope), accelerometers, MEMS accelerometers, velocity sensors, force sensors, pressure sensors, optical sensors, proximity sensors, RFID, and the like, in providing position information. For example, a wearer may have a position sense device mounted on their right index finger, where the device is able to sense motion of the finger. In this example, the user may activate the eyepiece either through some switching mechanism on the eyepiece or through some predetermined motion sequence of the finger, such as moving the finger quickly, tapping the finger against a hard surface, and the like. Note that tapping against a hard surface may be interpreted through sensing by accelerometers, force sensors, pressure sensors, and the like. The position sense device may then transmit motions of the finger as command information, such as moving the finger in the air to move a cursor across the displayed or projected image, or moving in rapid motion to indicate a selection, and the like. In embodiments, the position sense device may send the sensed command information directly to the eyepiece for command processing, or the command processing circuitry may be co-located with the position sense device, such as, in this example, as part of an assembly, including the sensors of the position sense device, mounted on the finger. Command information may be accompanied by a visual indicator. For instance, a cursor may change color when interacting with different content. For example, in order to know where your finger is when using a peripheral to control the glasses, a visual indication of the command information may be presented in the glasses.
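Interpreting a finger tap against a hard surface from accelerometer data can be as simple as thresholding spikes in acceleration magnitude, with a short debounce so one impact is not counted twice. The threshold and refractory window in this sketch are assumed values for illustration.

```python
def count_taps(samples, threshold=2.5, refractory=5):
    """Count taps in a stream of acceleration magnitudes (in g).

    A tap is a sample exceeding `threshold`; after a tap, `refractory`
    samples must pass before another tap is counted (debounce).
    Illustrative only -- real tap detectors also filter gravity and noise.
    """
    taps = 0
    cooldown = 0
    for a in samples:
        if cooldown > 0:
            cooldown -= 1
        elif a > threshold:
            taps += 1
            cooldown = refractory
    return taps
```

A count of two taps within a short window could then be mapped to the "double-tap" activation sequence described in the text.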
In embodiments, the wearer may have a plurality of position sense devices mounted on their body. For instance, and in continuation of the preceding example, the wearer may have position sense devices mounted on a plurality of points on the hand, such as with individual sensors on different fingers, or as a collection of devices, such as in a glove. In this way, the aggregate command information from the collection of sensors at different locations on the hand may be used to provide more complex command information. For instance, the wearer may use a sensor-device glove to play a game in the simulation and gaming applications of the present disclosure, where the glove senses the grasp of a ball, bat, racket, and the like, by the user's hands. In embodiments, the plurality of position sense devices may be mounted on different parts of the body, allowing the wearer to transmit complex motions of the body to the eyepiece for use by an application.
In embodiments, the sense device may have a force sensor, pressure sensor, or the like, such as for detecting when the sense device comes in contact with an object. For instance, a sense device may include a pressure sensor at the tip of a wearer's finger. In this case, the wearer may tap, multiple-tap, swipe, touch, and the like, to generate a command to the eyepiece. Force sensors may also be used to indicate degrees of touch, grip, push, and the like, where predetermined or learned thresholds determine different command information. In this way, commands may be delivered as a series of continuous commands that constantly update the command information being used in an application through the eyepiece. In an example, the wearer may be running a simulation, such as a game application, military application, commercial application, and the like, where movements and contact with objects (such as through at least one of a plurality of sense devices) are fed to the eyepiece as commands that influence the simulation displayed through the eyepiece. For instance, a sense device may be included in a controller, where the controller may have a force sensor, pressure sensor, inertial measurement unit, and the like, and where the controller may be used to generate a display of virtual writing, control a cursor associated with the display of the eyepiece, act as a computer mouse, provide control commands through physical motion and/or contact, and the like.
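Mapping force-sensor readings to graded commands via predetermined thresholds, as the paragraph above describes, might look like this sketch. The force bands are assumptions for illustration; as the text suggests, they could instead be learned per user.

```python
def classify_press(force_n, thresholds=(0.5, 2.0, 5.0)):
    """Turn a force reading (newtons) into graded command information.

    Bands: below t0 -> no contact, t0..t1 -> touch, t1..t2 -> grip,
    above t2 -> push. Threshold values are illustrative defaults.
    """
    t0, t1, t2 = thresholds
    if force_n < t0:
        return "none"
    if force_n < t1:
        return "touch"
    if force_n < t2:
        return "grip"
    return "push"
```

Sampling the sensor continuously and emitting the classification on each change would yield the stream of constantly updated commands described above.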
In embodiments, the sense device may include an optical sensor or optical transmitter as a way for movement to be interpreted as a command. For instance, a sense device may include an optical sensor mounted on the hand of the wearer, and the eyepiece housing may include an optical transmitter, such that when a user moves their hand past the optical transmitter on the eyepiece, the motions may be interpreted as commands by the eyepiece. Motions detected through optical sensors may include swiping past at different speeds, with repeated motions, with combinations of dwelling and movement, and the like. In embodiments, optical sensors and/or transmitters may be located on the eyepiece, mounted on the wearer (e.g. on the hand, on the foot, in a glove, on a piece of clothing), or used in combination between different areas on the wearer and on the eyepiece.
In one embodiment, a number of sensors useful for monitoring the condition of the wearer or a person in proximity to the wearer are mounted within the augmented reality glasses. Sensors have become much smaller, thanks to advances in electronics technology. Signal transducing and signal processing technologies have also made great progress in the direction of size reduction and digitization. Accordingly, it is possible to have not merely a temperature sensor in the AR glasses, but an entire sensor array. As noted, these sensors may include a temperature sensor, and also sensors to detect: pulse rate; beat-to-beat variability; EKG or ECG; respiration rate; core body temperature; heat flow from the body; galvanic skin response, or GSR; EMG; EEG; EOG; blood pressure; body fat; hydration level; activity level; oxygen consumption; glucose or blood sugar level; body position; and UV radiation exposure or absorption. In addition, there may also be a retinal sensor and a blood oxygenation sensor (such as an SpO2 sensor), among others. Such sensors are available from a variety of manufacturers, including Vermed, Bellows Falls, Vermont, USA; VTI, Ventaa, Finland; and ServoFlow, Lexington, Massachusetts, USA.
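One conventional measure of the "beat-to-beat variability" listed above is RMSSD, the root mean square of successive differences of RR intervals. The sketch below is a standard textbook formulation offered for illustration; the specification does not prescribe a particular metric.

```python
import math

def rmssd(rr_intervals_ms):
    """RMSSD heart-rate-variability metric from RR intervals (milliseconds).

    RMSSD = sqrt(mean of squared differences between successive intervals).
    """
    if len(rr_intervals_ms) < 2:
        raise ValueError("need at least two RR intervals")
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

Feeding the pulse sensor's interbeat intervals into a metric like this would let the glasses display a single variability number rather than a raw waveform.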
In some embodiments, it may be more useful to mount sensors on the person or on equipment of the person, rather than on the glasses themselves. For example, accelerometers, motion sensors, and vibration sensors may be usefully mounted on the person, on clothing of the person, or on equipment worn by the person. These sensors may maintain continuous or periodic contact with the controller of the AR glasses through a Bluetooth radio transmitter or other radio device adhering to IEEE 802.11 specifications. For example, if a physician wishes to monitor motion or shock experienced by a patient during a foot race, the sensors may be more useful if mounted directly on the person's skin, or even on a T-shirt worn by the person, rather than mounted on the glasses. In these cases, a more accurate reading may be obtained by a sensor placed on the person or on the clothing rather than on the glasses. Such sensors need not be as tiny as the sensors suitable for mounting on the glasses themselves, and, as seen, may be more useful.
The AR glasses or goggles may also include environmental sensors or a sensor array. These sensors are mounted on the glasses and sample the atmosphere or air in the vicinity of the wearer. These sensors or the sensor array may be sensitive to certain substances or to concentrations of substances. For example, sensors and arrays are available to measure carbon monoxide, oxides of nitrogen ("NOx"), temperature, relative humidity, noise level, volatile organic chemicals (VOCs), ozone, particulates, hydrogen sulfide, barometric pressure, and ultraviolet light and its intensity. Vendors and manufacturers include Sensares of Crolles, France; Cairpol of Ales, France; Critical Environmental Technologies of Canada, Delta, British Columbia, Canada; Apollo Electronics Co. of Shenzhen, China; and AV Technology Ltd. of Stockport, Cheshire, UK. Many other sensors are well known. These sensors may also be useful if mounted on the person, or on clothing or equipment of the person. These environmental sensors may include radiation sensors, chemical sensors, poisonous-gas sensors, and the like.
In one embodiment, environmental sensors, health monitoring sensors, or both are mounted on the frames of the augmented reality glasses. In another embodiment, the sensors may be mounted on the person or on clothing or equipment of the person. For example, a sensor for measuring electrical activity of the heart of the wearer may be implanted, with suitable accessories for transducing and transmitting a signal indicative of the person's heart activity.
The signal may be transmitted a very short distance via a Bluetooth® radio transmitter or other radio device adhering to IEEE 802.15.1 specifications. Other frequencies or protocols may be used instead. The signal may then be processed by the signal-monitoring and processing equipment of the augmented reality glasses, recorded, and displayed on the virtual screen available to the wearer. In another embodiment, the signal may also be sent via the AR glasses to a friend or squad leader of the wearer. Thus, the health and well-being of the person may be monitored by the person and by others, and may also be tracked over time.
In another embodiment, the environmental sensors may be mounted on the person or on equipment of the person. For example, radiation or chemical sensors may be more useful if worn on outer clothing or a belt of the person, rather than mounted directly on the glasses. As noted above, the signals from the sensors may be monitored locally by the person through the AR glasses. The sensor readings may also be transmitted elsewhere, either on demand or automatically, perhaps at set intervals, such as every quarter-hour or half-hour. Thus, a history of sensor readings, whether of the person's body readings or of the environment, may be made for tracking or trending purposes.
In an embodiment, an RF/micropower impulse radio (MIR) sensor may be associated with the eyepiece and serve as a short-range medical radar. The sensor may operate in an ultra-wide band. The sensor may include an RF/impulse generator, a receiver, and a signal processor, and may be useful for detecting and measuring cardiac signals by measuring the ion flow in cardiac cells within 3 mm of the skin. The receiver may be a phased array antenna to enable determining the location of the signal in a region of space. The sensor may be used to detect and identify cardiac signals through blockages, such as walls, water, concrete, dirt, metal, wood, and the like. For example, a user may be able to use the sensor to determine how many people are located inside a concrete structure by detecting how many heart rates are present. In another embodiment, a detected heart rate may serve as a unique identifier for a person so that they may be recognized in the future. In an embodiment, the RF/impulse generator may be embedded in one device, such as the eyepiece or some other device, while the receiver is embedded in a different device, such as another eyepiece or device. In this way, a virtual "tripwire" may be created between a transmitter and receiver when a heart rate is detected between them. In an embodiment, the sensor may be used as an in-field diagnostic or self-diagnosis tool. EKGs may be analyzed and stored for future use as a biometric identifier. A user may receive alerts of sensed heart rate signals, and of how many heart rates are present, as displayed content in the eyepiece.
Figure 29 depicts an embodiment 2900 of an augmented reality eyepiece or glasses with a variety of sensors and communication equipment. One or more environmental or health sensors are connected, locally or remotely, to a sensor interface through a short-range radio circuit and an antenna, as shown. The sensor interface circuit includes all the devices for detecting, amplifying, processing, and sending and/or transmitting the signals detected by the sensors. The remote sensors may include, for example, an implanted heart rate monitor or other body sensor (not shown). The other sensors may include an accelerometer, an inclinometer, a temperature sensor, a sensor suitable for detecting one or more chemicals or gases, or any of the other health or environmental sensors discussed in this disclosure. The sensor interface is connected to the microprocessor or microcontroller of the augmented reality device, from which point the information gathered may be recorded in memory, such as random access memory (RAM) or permanent memory, read-only memory (ROM), as shown.
In an embodiment, a sense device enables simultaneous electric field sensing through the eyepiece. Electric field (EF) sensing is a method of proximity sensing that allows computers to detect, evaluate, and work with objects in their vicinity. Physical contact with the skin, such as a handshake with another person or some other physical contact with a conductive or non-conductive device or object, may be sensed as a change in an electric field, and may either enable data transfer to or from the eyepiece or terminate a data transfer. For instance, video captured by the eyepiece may be stored on the eyepiece until a wearer of the eyepiece with an embedded electric field sensing transceiver touches an object and initiates data transfer from the eyepiece to a receiver. The transceiver may include a transmitter and a data sensing circuit, the transmitter including a transmitter circuit that induces electric fields, with the transceiver enabling two-way communication by detecting both transmitted and received data, distinguishing between transmit and receive modes, and outputting control signals corresponding to the two modes. An instantaneous private network between two people may be generated with a contact, such as a handshake. Data may be transferred between the eyepiece of a user and a data receiver or eyepiece of a second user. Additional security measures may be used to enhance the private network, such as facial or audio recognition, detection of eye contact, fingerprint detection, biometric entry, iris or retina tracking, and the like.
In embodiments, there may be an authentication facility associated with accessing functionality of the eyepiece, such as access to displayed or projected content, access to restricted projected content, or enabling functionality of the eyepiece itself, either in whole or in part (such as through a login to access functionality of the eyepiece), and the like. Authentication may be provided through recognition of the wearer's voice, iris, retina, fingerprint, or the like, or another biometric identifier. For example, the eyepiece or an associated controller may have an IR, ultrasonic, or capacitive touch sensor for receiving control input related to authentication or other eyepiece functions. A capacitive sensor can detect a fingerprint and launch an application or otherwise control a certain eyepiece function. Each finger has a different fingerprint, so each finger can be used to control different eyepiece functions or quick-start different applications, or provide various levels of authentication. Capacitive sensors cannot work with gloves, but ultrasonic sensors can, and may be utilized in the same manner to provide biometric authentication or control. An ultrasonic sensor useful in the eyepiece or associated controller is Sonavation's SonicSlide™ sensor, which uses Sonavation's SonicTouch™ technology; it works by acoustically measuring the ridges and valleys of a fingerprint to image the fingerprint in 256 shades of gray, distinguishing the smallest details in the fingerprint. The key imaging component of the SonicSlide™ sensor is a ceramic microelectromechanical system (MEMS) piezoelectric transducer array made from a ceramic composite material.
The authentication system may provide for a database of biometric inputs of a plurality of users, such that access control may be provided for use of the eyepiece based on policies and associated access privileges for each of the users entered into the database. The eyepiece may provide for an authentication process. For instance, the authentication facility may sense when a user has taken the eyepiece off and require re-authentication when the user puts it back on. This better ensures that the eyepiece provides access only to those users who are authorized, and only for those privileges that the wearer is authorized for. In an example, the authentication facility may detect the presence of a user's eye or head as the eyepiece is put on. In a first level of access, the user may only be able to access low-sensitivity items until authentication is complete. During authentication, the authentication facility may identify the user and look up their access privileges. Once these privileges have been determined, the authentication facility may then provide the appropriate level of access to the user. In the case of an unauthorized user being detected, the eyepiece may maintain access to low-sensitivity items, further restrict access, deny access entirely, and the like.
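The per-user access policy described above can be sketched as a small lookup: unauthenticated or unknown wearers get only low-sensitivity items, and privileges are looked up once authentication succeeds. The privilege tiers and user records below are illustrative assumptions, not part of the specification.

```python
# Illustrative policy database: user id -> granted access level.
ACCESS_DB = {"alice": "full", "bob": "standard"}
LEVELS = {"low": 0, "standard": 1, "full": 2}

def allowed(user_id, item_level, authenticated):
    """Return True if the wearer may access an item of the given
    sensitivity level ('low', 'standard', or 'full')."""
    if not authenticated or user_id not in ACCESS_DB:
        # Unknown or unauthenticated wearers: low-sensitivity items only.
        return LEVELS[item_level] <= LEVELS["low"]
    return LEVELS[item_level] <= LEVELS[ACCESS_DB[user_id]]
```

Pairing this check with a don/doff sensor gives the re-authentication behavior described: removing the eyepiece clears the `authenticated` flag, dropping the wearer back to low-sensitivity access until they authenticate again.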
In an embodiment, a receiver may be associated with an object to enable control of that object via touch by a wearer of the eyepiece, wherein touch enables transmission or execution of a command signal in the object. For instance, a receiver may be associated with a car door lock. When the wearer of the eyepiece touches the car, the car door may unlock. In another example, a receiver may be embedded in a medicine bottle. When the wearer of the eyepiece touches the medicine bottle, an alarm signal may be initiated. In another example, a receiver may be associated with a wall along a sidewalk. As the wearer of the eyepiece passes the wall or touches the wall, an advertisement may be launched in the eyepiece or on a video panel of the wall.
In an embodiment, when a wearer of the eyepiece initiates physical contact, a WiFi exchange of information with a receiver may provide an indication that the wearer is connected to an online activity, such as a game, or may provide verification of identity in an online environment. In this embodiment, a representation of the person could change color or undergo some other visual indication in response to the contact.
In embodiments, the eyepiece may include a haptic interface, such as in Figure 14, such as to enable tactile control of the eyepiece, for example with a swipe, tap, touch, press, click, roll of a rollerball, and the like. For instance, the haptic interface 1402 may be mounted on the frame of the eyepiece 1400, such as on one arm, on both arms, at the nose-bridge, at the top of the frame, at the bottom of the frame, and the like. In embodiments, the haptic interface 1402 may include controls and functions similar to a computer mouse, such as with left and right buttons, a 2D position control pad as described herein, and the like. For instance, the haptic interface may be mounted on the eyepiece at the user's temple, serving as a "temple mouse" controller for the content the eyepiece projects to the user, and may include a rotary selector and an enter button mounted on the temple. In another example, the haptic interface may be one or more vibratory temple motors that may vibrate to alert or notify the user, such as of danger to the left, danger to the right, a medical condition, and the like. The haptic interface may be mounted on a controller separate from the eyepiece, such as a worn controller or a hand-carried controller. If there is an accelerometer in the controller, it may sense the user tapping, such as on a keyboard, on their hand (either on the hand wearing the controller, or tapping with the hand that has the controller), and the like. The wearer may then touch the haptic interface in a plurality of ways to be interpreted as commands by the eyepiece, such as by tapping one or multiple times on the interface, by brushing a finger across the interface, by pressing and holding, by pressing more than one interface at a time, and the like. In embodiments, haptic interfaces may be attached to the wearer's body (e.g. their hand, arm, leg, torso, neck), to their clothing, as an attachment to their clothing, as a ring 1500, as a bracelet, as a necklace, and the like. For example, the interface may be attached to the body, such as on the back of the wrist, where touching different parts of the interface provides different command information (e.g. touching the front portion, the back portion, the center, holding for a period of time, tapping, swiping, and the like). In embodiments, the user's contact with the haptic interface may be interpreted through force, pressure, motion, and the like. For instance, the haptic interface may incorporate resistive touch technology, capacitive touch technology, proportional pressure touch technology, and the like. In an example, the haptic interface may utilize discrete resistive touch technology where the application calls for the interface to be simple, robust, low-power, and the like. In another example, capacitive touch technology may be utilized where greater functionality is required of the interface (such as motion, swiping, multi-touch, and the like). In another example, the haptic interface may utilize pressure touch technology, such as when variable-pressure commands are required. In embodiments, any of these or similar touch technologies may be used in any of the haptic interfaces described herein.
In one embodiment, a hand-held accessory may be used to control a virtual keyboard for input to the eyepiece. For example, if the hand-held device has a touch screen, the user may interact with the touch screen, which either presents an on-screen keyboard or is adapted to allow the user to interact with the device through the touch screen in coordination with a virtual keyboard to provide input to the glasses. For example, the virtual keyboard may be presented in the glasses, but rather than selecting items in the air, the user enables the touch pad device to be adapted to receive input corresponding to the virtual keyboard. The device may track the finger as it slides over a capacitive module, and a click of the device provides the feel of a keystroke. The device may have a touch surface on the front and one or more action buttons on the back or top, allowing the user to click to make a selection without needing to lift their finger off the touch surface. The letter the user has selected may be highlighted. The user may still do swipe-style text input, lifting their finger to end a word, inserting a space, double tapping to insert a period, and the like. Figure 159 depicts a virtual keyboard 15902 presented in the user's field of view. On the keyboard, two keys are highlighted, 'D' and 'Enter'. In this figure, a touch screen accessory device 15904 is being used to supply the input to the keyboard, which is then transmitted to the glasses as input. A visual indicator is provided of the input or control commands being executed, whether through the virtual interface or through the actual touch screen on the external device.
In embodiments, the eyepiece may include a haptic communication interface that uses magnetic fields to transmit and/or receive commands, telemetry, information, and the like between the eyepiece and an external device, or to transmit or receive commands, telemetry, information, and the like directly to and from the user. For example, the user may have a patterned magnetic material applied directly to some position on their body (e.g. on the skin, on a fingernail, inside the body) that responds physically (e.g. vibration, force, motion) to an oscillating magnetic field generated by the haptic communication interface. The oscillating magnetic field may convey information through modulation of the field, such as through the amplitude of the signal, time-related variations of the signal, the frequency of the signal, and the like. The information conveyed may be an alert, an indication of an incoming call, for entertainment, for communication, an indication associated with an eyepiece application, an indication of the proximity of the user to the eyepiece, for providing tactile feedback from the eyepiece to the user, and the like. Different commands may cause different stimulation effects on the patterned magnetic material, for different commands or indicators. For example, different stimulation effects may be realized through different frequencies and/or sequence patterns for incoming calls from different people in the user's contact list, different intensities for different alert levels, entertaining patterns for entertainment purposes, and the like.
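The modulation scheme described above — a distinct carrier frequency and burst pattern per command or caller — can be sketched as follows. This is a minimal illustration only, not the patent's implementation; the command table, frequencies, and burst timing below are invented for the example.

```python
import numpy as np

# Hypothetical command table: each command maps to a carrier frequency (Hz)
# and an on/off burst sequence, following the amplitude/frequency modulation
# idea in the text.
COMMANDS = {
    "incoming_call_alice": (200.0, [1, 0, 1, 0, 1, 0]),
    "incoming_call_bob":   (150.0, [1, 1, 0, 0, 1, 1]),
    "high_priority_alert": (300.0, [1, 1, 1, 1, 1, 1]),
}

def modulate(command, sample_rate=8000, burst_s=0.05):
    """Build the oscillating-field drive signal for a command: a sinusoidal
    carrier gated by the command's burst pattern."""
    freq, pattern = COMMANDS[command]
    t = np.arange(int(sample_rate * burst_s)) / sample_rate
    burst = np.sin(2 * np.pi * freq * t)
    return np.concatenate([burst * on for on in pattern])

sig = modulate("incoming_call_alice")  # 6 bursts of 400 samples each
```

A receiver (or the patterned material itself, per the text) would respond to the carrier frequency and pattern rather than decoding bits explicitly.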
The haptic communication interface may include a coil for transmitting and/or receiving the oscillating magnetic signal. The magnetic material may be a ferromagnetic material, a paramagnetic material, and the like, and may be applied as a power-driven element, an ink, a tattoo, a decal, a tape, transfer paper, a spray, and the like. In embodiments, the magnetic material may have the ability to demagnetize when the user is not using the eyepiece, to remain unmagnetized when not in the presence of a magnetic field from the eyepiece, and the like. The magnetic material may be applied in a functional spatial pattern, such as having a particular impedance responsive to a particular communication signal modulation, responsive to a particular frequency, and the like. The applied magnetic material may be a visible image, an invisible image, a tattoo, a mark, a label, a symbol, and the like. The applied magnetic material may include a pattern that utilizes the incoming magnetic signal to generate a return transmission signal back to the eyepiece haptic communication interface (e.g. carrying an identifier of the user), as a signal indicating the proximity between the eyepiece and the magnetic material, and the like. For example, the identifier may be a user ID, which is compared against an ID stored on the eyepiece to verify that the user is an authorized user of the eyepiece. In another example, the magnetic material may generate the return transmission signal to the eyepiece only when the magnetic material is in proximity to the eyepiece. For instance, the user may apply the magnetic material to a fingernail, and the user may provide a command indicator to the eyepiece by placing their finger near a user tactile interface.
In another example, the wearer may have an interface mounted in a ring as shown in Figure 15, in a hand-worn piece, and the like, where the interface may have at least one of a plurality of command interface types in wireless command connection with the eyepiece, such as a tactile interface, a position sensor device, and the like. In one embodiment, the ring 1500 may have controls that mirror a computer mouse, such as buttons 1504 (e.g. functioning as a one-button mouse, a multi-button mouse, and like mouse functions), a 2D position control 1502, a scroll wheel, and the like. The buttons 1504 and 2D position control 1502 may be as shown in Figure 15, where the buttons are on the side facing the thumb and the 2D position controller is on top. Alternately, the buttons and 2D position control may be in other configurations, such as all facing the thumb side, all located on the top surface, or any other combination. The 2D position control 1502 may be a 2D button position controller (e.g. such as the TrackPoint pointing device embedded in the keyboards of some laptop computers to control the position of the mouse), a pointing stick, a joystick, an optical track pad, an opto-touch wheel, a touch screen, a touch pad, a track pad, a scrolling track pad, a trackball, any other position or pointing controller, and the like.
In embodiments, control signals from the tactile interface (such as the ring tactile interface 1500) may be provided to the eyepiece over a wired or wireless interface, where the user is able to conveniently supply control inputs with their hand, thumb, finger, and the like. In embodiments, the ring may be able to expand to fit any finger or contract for a tighter fit. For example, the ring may have a custom strap or a spring-loaded hinge. For example, the user may be able to articulate the controls with their thumb, where the ring is worn on the user's index finger. In embodiments, a method or system may provide an interactive head-mounted eyepiece worn by a user, where the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content, a processor for handling content for display to the user, an integrated projector facility for projecting the content to the optical assembly, and a control device worn on the body of the user (e.g. the user's hand), the control device including at least one control component actuated by the user, and providing control commands from the actuation of the at least one control component to the processor as command instructions. The command instructions may be directed to the manipulation of content for display to the user. The control device may be worn on a first digit of the user's hand, and the at least one control component may be actuated by a second digit of the user's hand. The first digit may be the index finger, the second digit may be the thumb, and the first and second digits may be on the same hand of the user. The control device may have at least one control component mounted on the side of the index finger facing the thumb. The at least one control component may be a button. The at least one control component may be a 2D position controller.
The control device may have at least one button-actuated control component mounted on the side of the index finger facing the thumb, and a 2D position-controller-actuated control component mounted on the top-facing side of the index finger. The control components may be mounted on at least two digits of the user's hand. The control device may be worn as a glove on the user's hand. The control device may be worn on the user's wrist. The at least one control component may be worn on at least one digit of the hand, and a transmission facility may be worn separately on the hand. The transmission facility may be worn on the wrist. The transmission facility may be worn on the back of the hand. The control component may be at least one of a plurality of buttons. The at least one button may provide a function substantially similar to a conventional computer mouse button. Two of the plurality of buttons may function substantially similar to the primary buttons of a conventional two-button computer mouse. The control component may be a scrolling wheel. The control component may be a 2D position control component. The 2D position control component may be a button position controller, a pointing stick, a joystick, an optical track pad, an opto-touch wheel, a touch pad, a track pad, a scrolling track pad, a trackball, a capacitive touch screen, and the like. The 2D position control component may be controlled with the user's thumb. The control component may be a touch screen capable of implementing touch controls including button-like functions and 2D manipulation functions. The control component may be actuated when the user directs the pointing and control device into the projected processor content. The ring controls may be powered by an on-board battery that is disposable, rechargeable, solar-powered, and the like.
In embodiments, the wearer may have an interface mounted in a ring 1500AA that includes a camera 1502AA, as shown in Figure 15AA. In embodiments, the ring controller 1502AA may have control interface types as described herein, such as through buttons 1504, a 2D position control 1502, a 3D position control (such as utilizing accelerometers, gyroscopes), and the like. The ring controller 1500AA may be utilized in the control of functions within the eyepiece, such as controlling the manipulation of the displayed content projected to the wearer. In embodiments, the control interfaces 1502, 1504 may provide control aspects of the embedded camera 1502AA, such as on/off, zoom, pan, focus, recording a still image photograph, recording a video, and the like. Alternately, camera functions may be controlled through other control aspects of the eyepiece, such as through voice control, other tactile control interfaces as described herein, eye gaze detection, and the like. The camera may also enable automatic control functions, such as auto-focus, timer functions, face detection and/or tracking, auto-zoom, and the like. For example, the ring controller 1500AA with integrated camera 1502AA may be used to view the wearer 1508AA during a video conference initiated through the eyepiece, where the wearer extends the ring controller 1500AA (such as mounted on their finger) to allow the camera 1502AA to get a view of their face for transmission to at least one other participant of the video conference. Alternately, the wearer may remove the ring controller 1500AA and lay it down on a surface 1510AA (e.g. a table top surface) such that the camera 1502AA views the wearer. The image of the wearer 1512AA may then be displayed in a display area 1518AA of the eyepiece and transmitted to the other individuals on the video conference, such as along with images 1514AA of the other participants of the conference call. In embodiments, the camera 1502AA may provide a manual or automatic FOV (field of view) 1504AA adjustment. For example, the wearer may lay the ring controller 1500AA down on the surface 1510AA for use in a conference call, and the FOV 1504AA may be controlled either manually (e.g. through the button controls 1502, 1504, voice control, other tactile interfaces) or automatically (e.g. through face recognition) to point the camera's FOV 1504AA at the wearer's face. The FOV 1504AA may be enabled to change as the wearer moves, such as through tracking via face recognition. The FOV 1504AA may also zoom in or out to accommodate changes in the position of the wearer's face. In embodiments, the camera 1502AA may be used for a plurality of still and/or video applications, where the camera's field of view is provided to the wearer in the display area 1518AA of the eyepiece, memory may be available in the eyepiece for storing images/video, and images/video may be transferred from the eyepiece, communicated to an external storage facility, a user, a web application, and the like. In embodiments, the camera may be incorporated into a plurality of different mobile devices, such as worn on the arm, on the hand, on the wrist, on a finger, and the like, such as a watch 3202 with an embedded camera 3200 as shown in Figures 32 and 33. As with the ring controller 1502AA, any of these mobile devices may include the manual and/or automatic functions described for the ring controller 1502AA. In embodiments, the ring controller 1502AA may have additional sensors, embedded functions, control features, and the like, such as a fingerprint scanner, tactile feedback, an LCD screen, an accelerometer, Bluetooth, and the like. For example, the ring controller may provide synchronized monitoring between the eyepiece and other control components, such as described herein.
In embodiments, the eyepiece may provide a system and method for supplying an image of the wearer to video conference participants through the use of an external mirror, where the wearer views themselves in the mirror and the image of themselves is captured through the integrated camera of the eyepiece. The captured image may be used directly, or the image may be flipped to correct the image reversal of the mirror. In one example, the wearer may join a video conference with a plurality of other people, where the wearer may be able to view real-time video images of the others through the eyepiece. By using a common mirror in conjunction with the camera integrated in the eyepiece, the user may be able to see themselves in the mirror, have that image captured by the integrated camera, and provide the image of themselves to the others on the video conference. This image may also be available to the wearer as an image projected by the eyepiece, such as alongside the images of the other people involved in the video conference.
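Correcting the mirror's left-right reversal, as described, is a horizontal flip of each captured pixel row. A minimal sketch, with the image represented as a list of rows purely for illustration:

```python
def unmirror(image):
    """Undo the left-right inversion introduced by a mirror by flipping
    each pixel row horizontally. `image` is a list of rows of pixels."""
    return [row[::-1] for row in image]

# A 2x3 "image": column order is reversed in every row.
flipped = unmirror([[1, 2, 3],
                    [4, 5, 6]])
```

In a real pipeline the same operation would be applied to camera frames (e.g. a horizontal flip of the frame buffer) before transmission to the conference.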
In embodiments, a control component may also be provided in which a surface-sensing component is included in the control device for detecting motion across a surface. The surface-sensing component may be disposed on the palmar side of the user's hand. The surface may be at least one of a hard surface, a soft surface, the surface of the user's skin, the surface of the user's clothing, and the like. Control commands may be provided wirelessly, through a wired connection, and the like. The control device may control a pointing function associated with the displayed processor content. The pointing function may be control of a cursor position; selection of displayed content; selecting and moving displayed content; control of the focus, pan, field of view, size, position, and the like of the displayed content. The control device may control a pointing function associated with the viewed surrounding environment. The pointing function may be placing a cursor on an object viewed in the surrounding environment. The location of the viewed object may be determined by the processor in conjunction with a camera integrated with the eyepiece. The identity of the viewed object may be determined by the processor in conjunction with a camera integrated with the eyepiece. The control device may control a function of the eyepiece. The function may be associated with the displayed content. The function may be a mode control of the eyepiece. The control device may be foldable for ease of storage when not worn by the user. In embodiments, the control device may be used with an external device, such as to jointly control the external device with the eyepiece. The external device may be an entertainment facility, an audio facility, a portable electronic device, a navigation facility, a weapon, an automotive controller, and the like.
In embodiments, a body-worn control device (e.g. worn on a finger, strapped to the palm of the hand, on the arm, on the leg, on the torso, and the like) may provide 3D position sensor information to the eyepiece. For instance, the control device may act as an "air mouse", where 3D position sensors (e.g. accelerometers, gyroscopes, and the like) provide position information upon a user command (e.g. through a button click, a voice command, a visually detected gesture, and the like). The user may be able to use this facility to navigate the 2D or 3D images projected to the user by the eyepiece projection system. Further, the eyepiece may provide an external relay of the image for display or projection to others, such as in the case of a presentation. The user may be able to change the mode of the control device between 2D and 3D, such as to accommodate different functions, applications, user interfaces, and the like. In embodiments, multiple 3D control devices may be used for certain applications, such as in simulation applications.
In embodiments, a system may comprise: an interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content, wherein the optical assembly comprises a corrective element that corrects the user's view of the surrounding environment; an integrated processor for handling content for display to the user; an integrated image source for introducing the content to the optical assembly; and a tactile control interface mounted on the eyepiece that accepts control inputs from the user through at least one of the user touching the interface and the user being in proximity to the interface.
In embodiments, control of the eyepiece, and especially control of a cursor associated with displayed content, may be enabled through hand control, such as with the wearable device 1500 shown in Figure 15, a virtual computer mouse 1500A as shown in Figure 15A, and the like. For instance, the wearable device 1500 may transmit commands through a physical interface (e.g. a button 1502, a scroll wheel 1504), and the virtual computer mouse 1500A may be able to interpret commands by detecting the motions and actions of the user's thumb, fist, hand, and the like. In computing, a physical mouse is a pointing device that functions by detecting two-dimensional motion relative to its supporting surface. A physical mouse traditionally consists of an object held under one of the user's hands, with one or more buttons. It sometimes features other elements, such as "wheels" that allow the user to perform various system-dependent operations, or extra buttons or features that can add more controls or dimensional input. The motion of the mouse is translated into the motion of a cursor on a display, which allows for fine control of a graphical user interface. In the case of the eyepiece, the user may be able to utilize a physical mouse, a virtual mouse, or a combination of the two. In embodiments, a virtual mouse may involve one or more sensors attached to the user's hand, such as on the thumb 1502A, finger 1504A, palm 1508A, wrist 1510A, and the like, where the eyepiece receives signals from the sensors and translates the received signals into motion of a cursor on the eyepiece display to the user. In embodiments, the signals may be received through an external interface, such as the tactile interface 1402, through a receiver on the interior of the eyepiece, at a secondary communications interface, on an associated physical mouse or worn interface, and the like. The virtual mouse may also include actuators or other output-type elements attached to the user's hand, such as for providing haptic feedback to the user through vibration, force, pressure, electrical impulse, temperature, and the like. Sensors and actuators may be attached to the user's hand by way of a wrap, a ring, a pad, a glove, and the like. As such, the eyepiece virtual mouse may allow the user to translate motions of the hand into motion of the cursor on the eyepiece display, where "motions" may include slow movements, rapid motions, jerky motions, position, changes in position, and the like, and may allow the user to work in three dimensions, without the need for a physical surface, and including some or all of the six degrees of freedom. Note that because the "virtual mouse" may be associated with multiple portions of the hand, the virtual mouse may be implemented as multiple "virtual mouse" controllers, or as a distributed controller across multiple control members of the hand. In embodiments, the eyepiece may provide for the use of a plurality of virtual mice, such as one for each of the user's hands, one or more for the user's feet, and the like.
In embodiments, the eyepiece virtual mouse may need no physical surface to operate, and may detect motion through sensors of any of a plurality of accelerometer types (e.g. tuning fork, piezoelectric, shear mode, strain mode, capacitive, thermal, resistive, electromechanical, resonant, magnetic, optical, acoustic, laser, three-dimensional, and the like), and determine the translational and angular displacement of the hand, or of some portion of the hand, from the output signals of the sensors. For instance, accelerometers may produce output signals of magnitudes proportional to the translational acceleration of the hand in three directions. Pairs of accelerometers may be configured to detect rotational accelerations of the hand or of portions of the hand. The translational velocity and displacement of the hand or portions of the hand may be determined by integrating the accelerometer output signals, and the rotational velocity and displacement of the hand may be determined by integrating the difference between the output signals of the accelerometer pairs. Alternately, other sensors may be utilized, such as ultrasound sensors, imagers, IR/RF, a magnetometer, a gyro magnetometer, and the like. As accelerometers or other sensors may be mounted on various portions of the hand, the eyepiece may be able to detect a plurality of movements of the hand, ranging from the simple motions normally associated with computer mouse motion to more highly complex motions, such as the interpretation of complex hand motions in a simulation application. In embodiments, the user may require only small translational or rotational actions to have these actions translated into motions associated with user-intended actions on the eyepiece projection to the user.
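The double integration described above — accelerometer output to velocity, then to displacement — can be sketched with a trapezoidal cumulative integral. This is an illustrative numerical method only; a real device would also need bias/drift correction and gravity compensation, which are omitted here.

```python
import numpy as np

def integrate_motion(accel, dt):
    """Integrate uniformly-sampled acceleration (m/s^2) once to get
    velocity and twice to get displacement, using the trapezoidal rule.
    Initial velocity and displacement are assumed zero."""
    vel = np.concatenate([[0.0], np.cumsum((accel[1:] + accel[:-1]) / 2) * dt])
    disp = np.concatenate([[0.0], np.cumsum((vel[1:] + vel[:-1]) / 2) * dt])
    return vel, disp

# Constant 1 m/s^2 acceleration for 1 s should give v = 1 m/s, x = 0.5 m.
a = np.ones(101)
v, x = integrate_motion(a, dt=0.01)
```

The rotational case in the text is analogous: integrate the *difference* of a sensor pair's outputs (divided by their separation) instead of a single channel.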
In embodiments, the virtual mouse may have physical switches associated with it to control the device, such as an on/off switch mounted on the hand, on the eyepiece, or on another part of the body. The virtual mouse may also have on/off control and the like through predefined motions or actions of the hand. For example, operation of the virtual mouse may be enabled through a rapid back-and-forth motion of the hand. In another example, the virtual mouse may be disabled through a motion of the hand past the eyepiece, such as in front of the eyepiece. In embodiments, the virtual mouse for the eyepiece may provide interpretations of a plurality of motions as operations normally associated with physical mouse control, and as such be familiar to the user without training, such as single clicking with a finger, double clicking, triple clicking, right clicking, left clicking, click and drag, combination clicking, roller wheel motion, and the like. In embodiments, the eyepiece may provide for gesture recognition, such as interpreting hand gestures via mathematical algorithms.
In embodiments, gesture control may be provided through technologies that recognize capacitive changes resulting from variations in the distance of the user's hand from a conductor element that is part of the eyepiece's control system, thus requiring no equipment mounted on the user's hand. In embodiments, the conductor may be mounted as part of the eyepiece, such as on a temple or other portion of the frame, or as some external interface mounted on the user's body or clothing. For example, the conductor may be an antenna, where the control system behaves in a fashion similar to the touch-less musical instrument known as the theremin. The theremin uses the heterodyne principle to generate an audio signal, but in the case of the eyepiece, the signal may be used to generate a control input signal. The control circuitry may include a number of radio-frequency oscillators, such as one oscillator operating at a fixed frequency and another controlled by the user's hand, where the varying distance from the hand changes the input at the control antenna. In this technology, the user's hand acts as the ground plate (the user's body being the connection to ground) of a variable capacitor in an L-C (inductor-capacitor) circuit, which is part of the oscillator and determines its frequency. In another example, the circuit may use a single oscillator, two pairs of heterodyne oscillators, and the like. In embodiments, a plurality of different conductors may be used as control inputs. In embodiments, this type of control interface may be ideal for control inputs that vary across a range (e.g. a volume control, a zoom control, and the like). However, this type of control interface may also be used for more discrete control signals (e.g. on/off control), where a predetermined threshold determines the state change of the control input.
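The heterodyne principle above can be illustrated numerically: the hand adds a small capacitance to the variable oscillator's L-C tank, shifting its resonant frequency f = 1/(2π√(LC)), and the beat against the fixed oscillator becomes the control signal. The component values below are hypothetical, chosen only to show the magnitude of the effect.

```python
import math

def lc_frequency(L, C):
    """Resonant frequency of an L-C tank: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2 * math.pi * math.sqrt(L * C))

def beat_frequency(L, C_fixed, C_hand):
    """Heterodyne beat between a fixed oscillator and one whose tank
    capacitance is increased by C_hand (the hand acting as one plate
    of a variable capacitor to ground)."""
    return abs(lc_frequency(L, C_fixed) - lc_frequency(L, C_fixed + C_hand))

# Hypothetical values: 1 mH coil, 100 pF tank; the approaching hand adds ~1 pF.
base = lc_frequency(1e-3, 100e-12)            # carrier around 503 kHz
beat = beat_frequency(1e-3, 100e-12, 1e-12)   # audible-range beat, a few kHz
```

Note how a picofarad-scale change — well within the reach of a hand near an antenna — produces a kilohertz-scale beat, which is easy to map onto a continuous control input such as volume or zoom.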
In embodiments, the eyepiece may interface with a physical remote control device, such as a wireless track pad mouse, a hand-held remote control, a body-mounted remote control, a remote control mounted on the eyepiece, and the like. The remote control device may be mounted on an external piece of equipment, such as for personal use, gaming, professional use, military use, and the like. For example, the remote control may be mounted on a soldier's weapon, such as on a pistol grip, on a muzzle shield, on a fore grip, and the like, thus providing remote control to the soldier without the need to move their hands away from the weapon. The remote control may be removably mounted to the eyepiece.
In embodiments, a remote control for the eyepiece may be activated and/or controlled through a proximity sensor. A proximity sensor may be a sensor able to detect the presence of nearby objects without any physical contact. For example, a proximity sensor may emit an electromagnetic or electrostatic field, or a beam of electromagnetic radiation (infrared, for instance), and look for changes in the field or in a return signal. The object being sensed is often referred to as the proximity sensor's target. Different proximity sensor targets may demand different sensors. For example, a capacitive or photoelectric sensor might be suitable for a plastic target; an inductive proximity sensor requires a metal target. Other examples of proximity sensor technology include capacitive displacement sensors, eddy-current, magnetic, photocell (reflective), laser, passive thermal infrared, passive optical, CCD, reflection of ionizing radiation, and the like. In embodiments, the proximity sensor may be integral to any of the control embodiments described herein, including physical remote controls, the virtual mouse, interfaces mounted on the eyepiece, controls mounted on an external piece of equipment (e.g. a game controller, a weapon), and the like.
In embodiments, sensors for measuring the body motion of the user may be used to control the eyepiece, or as an external input, such as through the use of an inertial measurement unit (IMU), a 3-axis magnetometer, a 3-axis gyroscope, a 3-axis accelerometer, and the like. For instance, a sensor may be mounted on the user's hand, allowing the signals from the sensor to be used to control the eyepiece, as described herein. In another instance, sensor signals may be received and interpreted by the eyepiece to assess and/or utilize the body motion of the user for purposes other than control. In one example, sensors mounted on each leg and each arm of the user may provide signals to the eyepiece to allow the eyepiece to measure the user's gait. The gait of the user may then in turn be used to monitor changes in the user's gait over time, such as for monitoring changes in physical behavior, progress through physical therapy, changes resulting from head trauma, and the like. In the example of monitoring head trauma, the eyepiece may initially determine a baseline gait profile for the user, and then monitor the user over time, such as before and after a physical event (e.g. a sports-related collision, an explosion, a vehicle accident). In the case of an athlete or an individual in physical therapy, the eyepiece may be used to periodically measure the user's gait and maintain the measurements in a database for analysis. A running gait-versus-time profile may be generated, such as for monitoring the user's gait to indicate physical trauma, physical improvement, and the like.
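The baseline-versus-current gait comparison can be sketched as follows, assuming heel-strike timestamps have already been extracted from the leg sensor signals (the extraction step is omitted, and the timestamps below are invented for illustration):

```python
def stride_intervals(heel_strikes):
    """Time (s) between successive heel strikes of the same foot."""
    return [b - a for a, b in zip(heel_strikes, heel_strikes[1:])]

def gait_change(baseline_strikes, current_strikes):
    """Fractional change in mean stride interval versus the baseline
    profile; a large shift may flag trauma or therapy progress."""
    mean = lambda xs: sum(xs) / len(xs)
    b = mean(stride_intervals(baseline_strikes))
    c = mean(stride_intervals(current_strikes))
    return (c - b) / b

baseline = [0.0, 1.0, 2.0, 3.0, 4.0]        # steady 1.0 s strides
post_event = [0.0, 1.2, 2.4, 3.6, 4.8]      # slowed to 1.2 s strides
change = gait_change(baseline, post_event)  # ~0.2, i.e. 20% slower
```

A fuller profile would track additional metrics over time (stride variability, asymmetry between legs, cadence) in the database the text mentions; the mean interval is just the simplest case.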
In embodiments, control of the eyepiece, and especially control of a cursor associated with content displayed to the user, may be initiated through a facial actuation sensor 1502B that senses the movement of facial features of the user wearing the eyepiece, the tensing of facial muscles, the clenching of teeth, the motion of the jaw, and the like. For example, as shown in Figure 15B, the eyepiece may have a facial actuation sensor as an extension from an eyepiece earphone assembly 1504B, from a temple 1508B of the eyepiece, and the like, where the facial actuation sensor may sense forces, vibrations, and the like associated with the motion of a facial feature. The facial actuation sensor may also be mounted separately from the eyepiece assembly, such as part of a standalone earpiece, where the sensor output of the earpiece and facial actuation sensor may be transferred to the eyepiece by wired or wireless communication (e.g., Bluetooth or other communications protocols known to the art). The facial actuation sensor may also be attached around the ear, in the mouth, on the face, on the neck, and the like. The facial actuation sensor may also comprise a plurality of sensors, such as to optimize the sensed motion of different facial or interior movements. In embodiments, the facial actuation sensor may detect motions and interpret them as commands, or the raw signals may be sent to the eyepiece for interpretation. Commands may be commands for the control of eyepiece functions, controls associated with a cursor or pointer provided as part of the display of content to the user, and the like. For example, a user may clench their teeth once or twice to indicate a single or double click, such as is normally associated with the click of a computer mouse. In another example, the user may tense a facial muscle to indicate a command, such as a selection associated with the projected image. In embodiments, the facial actuation sensor may use noise-reduction processing to minimize background motions of the face, the head, and the like, such as through adaptive signal processing technologies. A voice activity sensor may also be used to reduce interference, such as from the user, from other individuals nearby, from surrounding environmental noise, and the like. In one example, the facial actuation sensor may also improve communications and eliminate noise by detecting vibrations in the cheek of the user during speech, such as with multiple microphones to identify the background noise and eliminate it through noise cancellation, volume augmentation, and the like.
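The teeth-clench interpretation above (one clench for a click, two in quick succession for a double-click) can be sketched as a small event grouper. The 400 ms pairing window and the command names are illustrative assumptions only.

```python
# Sketch of mapping facial-actuation (clench) events to cursor commands.
def interpret_clenches(timestamps, window_s=0.4):
    """Group clench timestamps (seconds) into CLICK / DOUBLE_CLICK commands."""
    commands, i = [], 0
    while i < len(timestamps):
        # Two clenches within the pairing window form a double-click.
        if i + 1 < len(timestamps) and timestamps[i + 1] - timestamps[i] <= window_s:
            commands.append("DOUBLE_CLICK")
            i += 2
        else:
            commands.append("CLICK")
            i += 1
    return commands

print(interpret_clenches([0.0]))            # ['CLICK']
print(interpret_clenches([0.0, 0.3]))       # ['DOUBLE_CLICK']
print(interpret_clenches([0.0, 1.0, 1.2]))  # ['CLICK', 'DOUBLE_CLICK']
```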
In embodiments, the user of the eyepiece may be able to obtain information on certain environmental features, locations, objects, and the like viewed through the eyepiece by raising their hand into the field of view of the eyepiece and pointing at the object or position. For instance, the pointing finger of the user may indicate an environmental feature, where the finger is not only in the view of the eyepiece but also in the view of an embedded camera. The system may now be able to correlate the position of the pointing finger with the location of the environmental feature as seen by the camera. Additionally, the eyepiece may have position and orientation sensors, such as GPS and a magnetometer, to allow the system to know the location and line of sight of the user. From this, the system may be able to extrapolate the position information of the environmental feature, such as to provide the location information to the user, to overlay the position of the environmental information onto a 2D or 3D map, to further correlate the established location information with secondary information about that location (such as the address, names of individuals at the address, the name of a business at that location, the coordinates of the location), and the like. Referring to Figure 15C, in an example, the user is looking through the eyepiece 1502C and pointing with their hand 1504C at a house 1508C in their field of view, where the embedded camera 1510C has both the pointing hand 1504C and the house 1508C in its field of view. In this instance, the system is able to determine the location of the house 1508C and provide location information 1514C and a 3D map superimposed onto the user's view of the environment. In embodiments, the information associated with an environmental feature may be provided by an external facility (communicating over a wireless communication connection), stored internally (such as downloaded to the eyepiece for the current location), and the like. In embodiments, the information provided to the wearer of the eyepiece may include any of a plurality of kinds of information related to the scene viewed by the wearer, such as geographic information, point-of-interest information, social networking information (e.g., Twitter or Facebook information related to a person standing in front of the wearer, augmented around the person such as 'hovering' around them), profile information (such as stored in the wearer's contacts list), historical information, consumer information, product information, retail information, safety information, advertisements, business information, security information, game-related information, humorous annotations, news-related information, and the like.
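One way the extrapolation step above could work is sketched below: given the wearer's GPS fix, the compass bearing toward the pointed-at feature, and an estimated range, the feature's coordinates are projected with a flat-earth (equirectangular) approximation, which is adequate at short range. This is an assumed helper, not the method of the disclosure.

```python
# Sketch: project an environmental feature's lat/lon from the wearer's
# position, a magnetometer bearing, and an estimated range.
import math

EARTH_R = 6_371_000.0  # mean earth radius, meters

def project_feature(lat_deg, lon_deg, bearing_deg, range_m):
    """Return (lat, lon) of a point `range_m` away along `bearing_deg`."""
    lat, brg = math.radians(lat_deg), math.radians(bearing_deg)
    dlat = range_m * math.cos(brg) / EARTH_R
    dlon = range_m * math.sin(brg) / (EARTH_R * math.cos(lat))
    return lat_deg + math.degrees(dlat), lon_deg + math.degrees(dlon)

# Pointing due north from the equator, ~111 km away: about one degree of latitude.
lat, lon = project_feature(0.0, 0.0, 0.0, 111_195.0)
print(round(lat, 3), round(lon, 3))  # 1.0 0.0
```

The resulting coordinates could then be matched against a map database to retrieve the address or business-name annotations described above.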
In embodiments, the user may be able to control their view perspective relative to a 3D projected image, such as a 3D projected image associated with the external environment, a 3D projected image that has been stored and retrieved, a 3D displayed movie (such as downloaded for viewing), and the like. For instance, and referring again to Figure 15C, the user may be able to change the view perspective of the 3D displayed image 1512C, such as by turning their head, with the 3D displayed image remaining anchored in the real-time external environment even as the user turns their head, moves their position, and the like. In this way, the eyepiece may be able to provide an augmented reality by overlaying information onto the external environment the user is viewing, such as the overlaid 3D displayed map 1512C, the location information 1514C, and the like, where the displayed map, information, and the like may change as the user's view changes. In another instance, with 3D movies or 3D-converted movies, the viewing perspective may be made controllable to some extent by the viewer, placing the viewer 'inside' the movie environment, where the user may be able to turn their head and have the view change in correspondence with the changed head position, where the user may be able to 'walk into' the image as they physically walk forward, have the perspective change as the user moves the gaze of their eyes, and the like. In addition, additional image information may be provided, such as at the sides of the user's view, which may be accessed by turning the head.
In embodiments, the user of an eyepiece may be able to synchronize at least their view of a projected image or video with the view of another user of an eyepiece or other video display device. For example, two separate eyepiece users may want to watch the same 3D map, game projection, point-of-interest projection, video, and the like, where the two viewers not only see the same projected content but have their views of the projected content synchronized. In an example, two users may want to view a region of a 3D map together, where the map is synchronized such that one user is able to point to locations on the 3D map that the other user sees and interacts with. The two users may be able to move about the 3D map, share the virtual-physical interactions of both users with the 3D map, and the like. Further, a group of eyepiece wearers may be able to interact with a projection collectively. In this way, two or more users may be able to have a unified augmented reality experience through the synchronization of their eyepieces. Synchronization of two or more eyepieces may be provided by communicating position information between the eyepieces, such as absolute position information, relative position information, translational and rotational position information, and the like, such as from position sensors as described herein (e.g., gyroscopes, IMUs, GPS). Communications between the eyepieces may be channeled over the Internet, over a cell network, over a satellite network, and the like. Processing of the position information contributing to synchronization may be executed in the main processor of a single eyepiece, shared between a group of eyepieces, executed in a remote server system, and the like, or any combination thereof. In embodiments, the coordinated, synchronized view of projected content between multiple eyepieces may provide an augmented reality experience extended from a single individual to multiple individuals, where the multiple individuals benefit from the group augmented reality experience. For example, a group of concert-goers may synchronize their eyepieces with a feed from the concert production so that visual effects or audio may be pushed to the people with eyepieces by the concert production, the performers, other audience members, and the like. In an example, a performer may have an eyepiece and be able to control content transmitted to the audience members. In an embodiment, the content may be the performer's view of the surroundings. The performer may also use the eyepiece for various applications, such as controlling an external lighting system, interacting with an augmented reality drum kit or mixing board, recalling lyrics, and the like.
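The pose-sharing idea above can be illustrated with a toy example: each eyepiece periodically sends its position and yaw, and a receiver computes where a shared anchor (e.g., a point on the common 3D map) lies relative to its own pose, so each wearer's rendering stays consistent with the same world point. The message content and the 2D math are illustrative assumptions.

```python
# Sketch: compute the bearing from a wearer's pose to a shared map anchor,
# so two synchronized eyepieces both render the anchor in the right direction.
import math

def bearing_to_anchor(pose, anchor):
    """Bearing (deg, relative to the wearer's own yaw) from pose to anchor.

    pose = (x, y, yaw_deg); bearing convention: 0 deg = +y axis, clockwise.
    """
    px, py, yaw = pose
    ax, ay = anchor
    absolute = math.degrees(math.atan2(ax - px, ay - py))
    return (absolute - yaw) % 360.0

anchor = (0.0, 10.0)                      # shared map point
user_a = (0.0, 0.0, 0.0)                  # at origin, facing +y
user_b = (10.0, 10.0, 270.0)              # east of anchor, facing west
print(bearing_to_anchor(user_a, anchor))  # 0.0 (dead ahead for A)
print(bearing_to_anchor(user_b, anchor))  # 0.0 (dead ahead for B as well)
```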
In embodiments, images or video displayed on the eyepiece may be synchronized with images or video displayed on, or captured by, connected devices (with which the eyepiece has a communications link), or with feeds directly from remote cameras. A feed may be selected, or another action or control signal may be initiated, by a sensor input received from one of the connected devices, by metadata sent by one of the other connected devices, and the like. The other video display devices may be other eyepieces, desktop computers, laptop computers, smart phones, tablet computers, televisions, and the like. The eyepiece, the devices, and the remote cameras may be connected through wide-area, local-area, metropolitan-area, personal-area, and cloud network communications links. The sensor input may be an audio sensor input, a video sensor input, and the like. Other actions that may be initiated by a received sensor input or control signal may include initiating actions such as tracking a target, sending a message, or initiating audio-video synchronization as described elsewhere herein. For example, video captured by the eyepiece of a guard at a remote checkpoint or screening location may be run through facial recognition, and the feed from the guard's eyepiece may be selected automatically when a person of interest is identified, for display on an administrator's eyepiece.
In embodiments, the eyepiece may use sound projection techniques to implement audio direction for the wearer of the eyepiece, such as with surround-sound audio techniques. Implementing audio direction for the wearer may include reproducing sound from the direction of its source (in real time or as a playback). A visual or audible indicator may be included to provide the direction of the sound source. Sound projection techniques may be useful for individuals with hearing impediments or obstructions, such as hearing loss the user suffers from, headphones the user is wearing, hearing protection the user wears against hearing damage, and the like. In this instance, the eyepiece may provide an enhanced 3D audible reproduction. In one example, the wearer may have headphones on when a gunshot occurs. In this instance, the eyepiece may be able to reproduce a 3D sound profile of the shot, allowing the wearer to react to the shot knowing where the sound came from. In another example, a wearer with headphones, with hearing loss, in a noisy environment, and the like, may not be able to tell what is being said and/or the direction of the speaker, but is provided 3D sound enhancement from the eyepiece (for instance, the wearer is listening to other nearby individuals through headphones, and so there is no directionality information). In another example, the wearer may be in a noisy ambient environment, or in an environment where periodic loud noises may be generated. In this instance, the eyepiece may have the ability to cut off loud sounds to protect the wearer's hearing, but the sound may have been so loud that the wearer cannot tell where it came from, or their ears may be ringing such that they cannot now hear anything at all. To aid in this situation, the eyepiece may provide visual, audible, vibratory, and similar cues to the wearer to indicate the direction of the sound source. In embodiments, in the case where the wearer's ears are plugged to protect them from loud noises, the eyepiece may provide 'enhanced' hearing, using earbuds to generate reproductions of sound to substitute for those sounds missed from the natural world. This artificial sound may then be used to give directionality to communications from wireless transmissions to an operator that could not be heard naturally.
In embodiments, an example of a configuration for establishing the directionality of a source sound may be microphones pointed in different directions. For instance, at least one microphone may be used for the sound of the wearer, at least one for the ambient environment, at least one pointed downward, and possibly more in a plurality of different discrete directions. In this instance, the downward-pointing microphone may be used to subtract out isolated sounds, and this may be combined with 3D surround-sound and hearing-enhancement techniques, as described herein.
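One standard way a microphone pair like the one above can establish source direction is by time-difference-of-arrival: the inter-microphone delay gives an angle of arrival via arcsin(c·Δt / d). The sketch below assumes a 15 cm spacing and nominal speed of sound; it is illustrative, not the disclosed configuration.

```python
# Sketch: angle of arrival from the time delay between two microphones.
import math

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 C

def angle_of_arrival(delay_s, mic_spacing_m=0.15):
    """Angle (deg) off the array's broadside axis; + = toward the later mic."""
    s = SPEED_OF_SOUND * delay_s / mic_spacing_m
    s = max(-1.0, min(1.0, s))  # clamp numerical overshoot
    return math.degrees(math.asin(s))

print(round(angle_of_arrival(0.0), 1))           # 0.0 (source straight ahead)
print(round(angle_of_arrival(0.15 / 343.0), 1))  # 90.0 (source fully to one side)
```

With three or more microphones in discrete directions, the same delays resolve direction in full azimuth rather than a single axis.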
In an example of the sound-enhancement system as part of the eyepiece, there may be several users with eyepieces, such as in a noisy environment, where all the users have 'plugged ears,' as implemented by an artificial noise barrier created by the eyepiece earbuds. One of the wearers may call out that they need a certain piece of equipment. Between all the ambient noise and the hearing protection the eyepieces have created, no one can hear the request for the equipment. Here, the wearer making the verbal request has a filtering microphone near their mouth that can wirelessly communicate the request to the others, where their eyepiece may relay the voice signal to the other users' eyepieces and to the ear on the correct side, so the others will know to look right or left to see who made the request. The system may be further enhanced with the geolocation of all the wearers and a 'virtual' surround-sound system, where the 'virtual' surround-sound system uses the two earbuds to give a 3D spatial perception (such as SRS true surround techniques).
In embodiments, the auditory cues may also be computer-generated, so that the user doing the communicating does not need to speak the communication, but may select it from a list of commonly used commands, or the computer may generate the communication based on preconfigured conditions, and the like. In one example, the wearer may be in a situation where they do not want a display in front of their eyes but do want earbuds placed in their ears. In this case, if they want to notify someone in the group to get up and follow them, they may simply click a controller a specific number of times, or provide a visual gesture to the camera, IMU, and the like. The system may select the 'follow me' command and send it to the other users along with the 3D system location of the user making the communication, enticing them to audibly locate a position that is actually beyond their sight. In embodiments, the directional information may be determined and/or provided by the location information of the users from the eyepieces.
The eyepiece may include facilities for providing tactile sensing of vibration to the user, such as through vibratory actuators in the frame or temples of the eyewear structure (e.g., a mechanical vibration motor, a piezoelectric vibration actuator, an ultrasonic vibration actuator, and the like). The vibration may be provided as a message indicator to the user, as an indicator for a user with impaired vision (such as due to darkness, smoke, clouds, blindness), as part of a game, as part of a simulation, and the like. Vibratory actuators may be used in the side temples of the eyepiece, alone or together with speakers, to help create a 3D visual-audio-vibration virtual reality environment, such as for gaming, simulations, and the like. For example, a vibratory actuator may be mounted in each side temple of the eyepiece, such that when an application presents a projectile flying past the left side of the user's head, the left vibratory actuator is made to vibrate in a manner simulating the sensation of the projectile actually flying past the user. In addition, the speaker on that side may synchronously apply a sound imitating what the projectile would sound like flying past the user's head. Vibratory actuators and/or speakers may be mounted on the eyepiece in a manner that provides a 3D vibration-audio experience to the user, augmenting the visual experience provided through the visually displayed content, such as 3D visually displayed content. In this way, the user may be immersed in a more perceptually complete virtual 3D environment. In embodiments, the present disclosure may comprise an interactive head-mounted eyepiece worn by a user, where the eyepiece includes: an optical assembly through which the user views the surrounding environment and content displayed through it, an integrated image source adapted for introducing the content to the optical assembly, and a processing facility adapted for managing the functions of the eyepiece, where the head-mounted eyepiece has a structure comprising: a frame through which the user views the surrounding environment, left and right temples for supporting the frame on the user's head, and a vibratory actuator in each of the left and right temples, each vibratory actuator independently responsive to a vibration command from the processing facility. The vibration command may initiate a vibration in one of the vibratory actuators in response to a virtual projectile presented as part of the displayed content, a virtual explosion, a message indicator, a visual cue, a warning, and the like. The displayed content may be provided as part of a simulation the user is engaged in, a gaming application, a utility application, and the like. The application invoking the vibration command may be run locally on the eyepiece, run in part or in whole through an external platform, and the like, where the eyepiece has a communicative interconnection with the external platform. In addition, the eyepiece may include integrated speakers as described herein, such as in each of the left and right temples, where a vibration command initiating a vibration in one of the vibratory actuators is synchronized in time with an audible command initiating a sound in the speaker in the temple on the same side when the vibration command is received.
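The left/right synchronization of vibration and sound described above can be sketched with a simple constant-sum panning law: the virtual projectile's azimuth (degrees, negative = left of the head) maps to left and right actuator/speaker intensities. The gain law and interface are assumptions for illustration, not the disclosed implementation.

```python
# Sketch: pan a virtual projectile's azimuth onto left/right vibration and
# speaker gains so both cues agree on the projectile's side.
def pan_cue(azimuth_deg):
    """Return (left_gain, right_gain) in [0, 1] for a source at azimuth_deg."""
    az = max(-90.0, min(90.0, azimuth_deg))  # clamp to the lateral arc
    right = (az + 90.0) / 180.0
    return round(1.0 - right, 2), round(right, 2)

print(pan_cue(-90.0))  # (1.0, 0.0): projectile passing on the left
print(pan_cue(0.0))    # (0.5, 0.5): straight overhead
print(pan_cue(90.0))   # (0.0, 1.0): passing on the right
```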
In embodiments, the eyepiece may provide aspects of signals intelligence (SIGINT), such as using communications signals in use (WiFi, 3G, Bluetooth, and the like) to gather signals intelligence for devices and users in proximity to the eyepiece wearer. These signals may come from other eyepieces, such as for gathering information about other known friendly users; from other eyepieces picked up from unauthorized individuals, such as through signals generated when an unauthorized user attempts to use an eyepiece; from other communications devices (such as radios, cellular phones, pagers, walkie-talkies); from electrical signals from devices that may not be directly used for communications; and the like. The information collected by the eyepiece may be directional information, location information, movement information, the number and/or rate of communications, and the like. In addition, information may be collected through the coordinated operation of multiple eyepieces, such as in the triangulation of signals to determine the position of a signal.
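The coordinated triangulation mentioned above can be illustrated in two dimensions: two eyepieces at known positions each report a bearing to the same emitter, and intersecting the two bearing rays yields its position. Flat geometry and noise-free bearings are simplifying assumptions.

```python
# Sketch: locate a signal source from two bearings taken by two eyepieces.
import math

def triangulate(p1, bearing1_deg, p2, bearing2_deg):
    """Intersect two bearing rays (0 deg = +y, clockwise); returns (x, y)."""
    def direction(b):
        r = math.radians(b)
        return math.sin(r), math.cos(r)
    (x1, y1), (dx1, dy1) = p1, direction(bearing1_deg)
    (x2, y2), (dx2, dy2) = p2, direction(bearing2_deg)
    denom = dx1 * dy2 - dy1 * dx2
    if abs(denom) < 1e-9:
        return None  # parallel bearings: no fix possible
    t = ((x2 - x1) * dy2 - (y2 - y1) * dx2) / denom
    return x1 + t * dx1, y1 + t * dy1

# Emitter at (5, 5): a wearer at the origin sees it at 45 deg; a second
# wearer at (10, 0) sees it at 315 deg (up and to the left).
x, y = triangulate((0.0, 0.0), 45.0, (10.0, 0.0), 315.0)
print(round(x, 6), round(y, 6))  # 5.0 5.0
```

In practice the bearings would come from received-signal-strength or antenna-array estimates, and more than two eyepieces would be combined for a least-squares fix.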
Referring to Figure 15D, in embodiments, the user of the eyepiece 1502D may be able to use multiple hand/finger points from their hand 1504D to define the field of view (FOV) 1508D of the camera 1510D relative to the see-through view, such as for augmented reality applications. For instance, in the example shown, the user is utilizing their first finger and thumb to adjust the FOV 1508D of the camera 1510D of the eyepiece 1502D. The user may use other combinations to adjust the FOV 1508D, such as combinations of fingers, of fingers and thumb, of fingers and thumbs of both hands, using a palm, a cupped hand, and the like. The use of multiple hand/finger points may enable the user to alter the FOV 1508D of the camera 1510D in much the same way as a user of a touch screen, where different points of the hand/fingers establish points of the FOV to establish the desired view. In this instance, however, no physical contact is made between the user's hands and the eyepiece. Here, the camera may be commanded to associate portions of the user's hand with the establishment or change of the camera's FOV. The command may be any command type described herein, including but not limited to hand motions in the FOV of the camera, commands associated with a physical interface on the eyepiece, commands associated with a sensed motion near the eyepiece, commands received from a command interface on some portion of the user, and the like. The eyepiece may be able to recognize finger/hand motions as commands, such as in some repetitive motion. In embodiments, the user may also use this technique to adjust some portion of the projected image, where the eyepiece relates the image viewed by the camera to some aspect of the projected image, such as relating the hand/finger points in view to the projected image to the user. For example, the user may be simultaneously viewing the external environment and a projected image, and the user uses this technique to change the projected viewing area, region, magnification, and the like. In embodiments, the user may perform a change of FOV for a plurality of reasons, including zooming in or out from a viewed scene in the live environment, zooming in or out from a viewed portion of the projected image, changing the viewing area allocated to the projected image, changing the perspective view of the environment or the projected image, and the like.
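The two-point gesture above can be reduced to a small geometric step: the camera-detected fingertip and thumb positions (in normalized image coordinates) define opposite corners of the new FOV crop rectangle, much like a touch-screen pinch but without physical contact. The coordinate convention is an assumption.

```python
# Sketch: derive a camera FOV crop rectangle from two detected hand points.
def fov_from_points(p1, p2):
    """Return (x, y, width, height) crop rect from two normalized corners."""
    x1, y1 = p1
    x2, y2 = p2
    x, y = min(x1, x2), min(y1, y2)
    return x, y, abs(x2 - x1), abs(y2 - y1)

# Thumb at lower-left and index finger at upper-right of the desired region.
print(fov_from_points((0.25, 0.75), (0.75, 0.25)))  # (0.25, 0.25, 0.5, 0.5)
```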
In embodiments, the eyepiece may allow simultaneous FOVs. For example, wide, medium, and narrow camera FOVs may be used at the same time, where the user may have the different FOVs presented in the view simultaneously (i.e., a wide FOV for showing the whole site, which may be static, and a narrow FOV for focusing on a specific target, perhaps moving along with the eyes or a cursor).
In embodiments, the eyepiece may be able to track the eye through light reflected off the user's eye, to determine where the user is looking or the movement of the user's eye. This information may then be used to help correlate the user's line of sight relative to the projected image, a camera view, the external environment, and the like, and may be used in control techniques as described herein. For instance, the user may gaze at a location on the projected image and make a selection, such as with an external remote control or with some detected eye movement (such as blinking). In an example of this technique, and referring to Figure 15E, transmitted light 1508E, such as infrared light, may be reflected 1510E from the eye 1504E and sensed at the optical display 502 (such as with a camera or other optical sensor). The information may then be analyzed to extract eye rotation from changes in the reflections. In embodiments, an eye-tracking facility may use the corneal reflection and the center of the pupil as features to track over time; use reflections from the front of the cornea and the back of the lens as features to track; image features from inside the eye, such as the retinal blood vessels, and follow these features as the eye rotates; and the like. Alternatively, the eyepiece may use other techniques to track the movements of the eye, such as components surrounding the eye, components mounted in contact lenses on the eye, and the like. For instance, the user may be provided a special contact lens with embedded optical components for measuring the motion of the eye, such as a mirror, a magnetic field sensor, and the like. In another instance, electric potentials may be measured and monitored with electrodes placed around the eyes, utilizing the steady electric potential field of the eye as a dipole, such as with its positive pole at the cornea and its negative pole at the retina. In this instance, the electric signal may be derived using contact electrodes placed on the skin around the eye, on the frame of the eyepiece, and the like. If the eye moves from the center position toward the periphery, the retina approaches one electrode while the cornea approaches the opposing one. This change in the orientation of the dipole, and of the resulting electric potential field, produces a change in the measured signal. By analyzing these changes, eye movement can be tracked.
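The electro-oculography (EOG) variant above lends itself to a very small sketch: because the cornea is electrically positive relative to the retina, skin electrodes beside the eye see a potential roughly proportional to horizontal rotation. The sensitivity constant below is an illustrative assumption, not a clinical or disclosed value.

```python
# Sketch: estimate horizontal gaze angle from an EOG electrode potential.
EOG_UV_PER_DEG = 16.0  # assumed sensitivity: microvolts per degree of rotation

def gaze_angle_from_eog(sample_uv, baseline_uv):
    """Horizontal gaze angle (deg, + = toward the positive electrode)."""
    return (sample_uv - baseline_uv) / EOG_UV_PER_DEG

baseline = 120.0                             # resting potential, microvolts
print(gaze_angle_from_eog(120.0, baseline))  # 0.0 (looking straight ahead)
print(gaze_angle_from_eog(280.0, baseline))  # 10.0 (eye rotated 10 degrees)
```

A corneal-reflection tracker would replace this linear map with pupil-center and glint positions from the camera, but the downstream gaze-to-control logic is the same.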
Another example of how the user's eye gaze direction and associated control may be applied is the placement (by the eyepiece) and optional selection (by the user) of a visual indicator in the peripheral vision of the user, such as in order to reduce clutter in the narrow portion of the user's field of view around the gaze direction, where visual input is greatest. Since the brain is limited in how much information it can process at once, and the brain pays the most attention to visual content close to the gaze direction, the eyepiece may provide projected visual indicators in the visual periphery as cues to the user. In this way, the brain may only need to process the detection of the indicator, rather than the information associated with the indicator, reducing the possibility of the information overloading the user. The indicator may be a blinking icon, photo, color, symbol, object, and the like, and may indicate an alert, the arrival of an email, an incoming call, a calendar event, an internal or external processing facility needing attention from the user, and the like. With the visual indicator in the periphery, the user may notice it without being distracted by it. The user may then optionally decide to promote the content associated with the visual cue in order to see more information, such as by gazing at the visual indicator and, by doing so, opening its content. For example, an icon indicating an incoming email may indicate that an email has been received. The user may notice the icon and choose to ignore it (for instance, if the icon is not activated for a period of time, it disappears, such as through gaze or some other control facility). Alternatively, the user may notice the visual indicator and choose to 'activate' it by gazing in the direction of the visual indicator. In the case of the email, when the eyepiece detects that the user's eye gaze coincides with the position of the icon, the eyepiece may open the email and display its contents. In this way, the user maintains command over what information is being attended to, and as a result minimizes interruptions and maximizes the efficiency of content use.
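The activate-versus-ignore behavior above can be sketched as a dwell check: the indicator opens its content only when the tracked gaze point stays inside the icon's screen region for a minimum time, so a passing glance can still ignore it. The dwell threshold and geometry are illustrative assumptions.

```python
# Sketch: open a peripheral indicator only after sustained gaze dwell.
def indicator_activated(gaze_samples, icon_rect, dwell_needed=0.5, dt=0.1):
    """gaze_samples: (x, y) per frame; True once dwell time is reached."""
    x0, y0, w, h = icon_rect
    dwell = 0.0
    for gx, gy in gaze_samples:
        if x0 <= gx <= x0 + w and y0 <= gy <= y0 + h:
            dwell += dt
            if dwell >= dwell_needed:
                return True
        else:
            dwell = 0.0  # gaze left the icon: reset the dwell timer
    return False

email_icon = (0.85, 0.05, 0.1, 0.1)            # top-right corner of display
glance = [(0.9, 0.1)] * 3 + [(0.5, 0.5)]       # 0.3 s: too brief, ignored
stare  = [(0.9, 0.1)] * 6                      # 0.6 s: opens the email
print(indicator_activated(glance, email_icon))  # False
print(indicator_activated(stare, email_icon))   # True
```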
In embodiments, and in association with certain optics configurations described herein (such as a front-lit LCoS), feedback between two or more displays may ensure the displays have the same brightness and contrast. In embodiments, a camera in each display may be utilized. The current to the LEDs may be controlled and color balance may be obtained, such as by selecting LEDs of similar quality, output, and/or color (such as from similar frequency bins); right and left pulse-width modulation (PWM) values may be provided; and periodic calibrations may be performed. In embodiments, a calibration of the power spectrum may be achieved. If the displays are turned down due to high outside brightness, the user may be made aware of the calibration of each display. In embodiments, equal brightness, color saturation, color balance, hue, and the like may be created between the two displays. This may prevent the user's brain from ignoring one of the displays. In embodiments, a feedback system from the displays may be created that allows the user or another person to adjust the brightness and the like, so that each display has a constant and/or consistent brightness, color saturation, balance, hue, and the like. In embodiments, there may be a brightness sensor on each display, which may be a color, RGB, white, or full-light sensor, or the like. In embodiments, the sensor may be a power sensor monitoring or checking the power delivered to, or consumed by, the LEDs. The user or another person may adjust one or more of the displays by increasing or decreasing the power supplied to the LEDs. This may be done during manufacture, and/or may be done during the life of the eyepiece and/or periodically. In embodiments, there may be a dynamic-range aspect: as the LEDs and/or power supply dim over time, there may be a power algorithm that fine-tunes the two so that brightness remains consistent across the displays. In embodiments, the user and/or the manufacturer or the eyepiece may adjust the LEDs to follow the same brightness curve as the power changes. There may be RGB LEDs, and the LED curves may be matched between the two displays. Accordingly, brightness, color saturation, color balance, hue, and the like may be controlled over a dynamic range. In embodiments, these may be measured and controlled during manufacture, over a dynamic range, during the lifetime of the glasses, and the like. In embodiments, equal brightness, color saturation, color balance, hue, and the like between the two displays may be created in actuality, or may be created based on differences between the user's eyes as perceived by the user. In embodiments, adjustment of brightness, color saturation, color balance, hue, and the like may be performed by the user, by the manufacturer, and/or automatically by the eyepiece based on feedback, various programmed algorithms, and the like. In embodiments, sensor feedback may cause automatic and/or manual adjustment of at least one of brightness, color saturation, color balance, hue, and the like.
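The luminance-feedback loop described above can be sketched, under stated assumptions, as follows: each display's brightness sensor is read, and the LED PWM duty of the dimmer side is nudged up until both readings fall within a matching tolerance. Sensor units, step size, tolerance, and the toy sensor models are illustrative only.

```python
# Sketch: balance two displays' brightness by adjusting LED PWM duty
# (0-255) based on per-display luminance sensor feedback.
def balance_displays(read_left, read_right, pwm_left, pwm_right,
                     tolerance=0.5, step=1, max_iters=100):
    """Raise the dimmer side's PWM duty until luminances match."""
    for _ in range(max_iters):
        ll, lr = read_left(pwm_left), read_right(pwm_right)
        if abs(ll - lr) <= tolerance:
            break
        if ll < lr:
            pwm_left = min(255, pwm_left + step)
        else:
            pwm_right = min(255, pwm_right + step)
    return pwm_left, pwm_right

# Toy sensor models: the right LED is 20% less efficient (bin variation).
left_sensor = lambda pwm: 1.0 * pwm
right_sensor = lambda pwm: 0.8 * pwm
l, r = balance_displays(left_sensor, right_sensor, 100, 100)
print(l, r)  # 100 125: right duty raised to offset its weaker LED
```

The same loop could drive color balance by running it per RGB channel, matching the per-channel LED curves mentioned above.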
In embodiments, a system may comprise an interactive head-mounted eyepiece worn by a user, where the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content, and an integrated image source for introducing the content to the optical assembly, where the optical assembly includes two or more displays, and where at least one of the brightness, color saturation, color balance, and hue of at least one of the displays is adjusted such that at least one of the brightness, color saturation, color balance, and hue of the two or more displays is balanced relative to one another within a predetermined range. In embodiments, the adjustment may include brightness, color saturation, color balance, hue, and the like, such that at least one of these is balanced between the two or more displays within a predetermined range. In embodiments, the adjustment of at least one of brightness, color saturation, color balance, hue, and the like may be made based on detection of the power delivered to the integrated image source. In embodiments, the adjustment may be based on a power algorithm, such that at least one of brightness, color saturation, color balance, hue, and the like remains consistent between the two or more displays. In further embodiments, the adjustment may be based on a sensor with full-light sensor feedback. In embodiments, at least one of brightness, color saturation, color balance, hue, and the like may be adjusted during manufacture, over the dynamic output range produced by the integrated image source, and the like. In embodiments, the system may be adapted to periodically and automatically check at least one of the brightness, color saturation, color balance, hue, and the like of the two or more displays relative to one another over the life cycle of the eyepiece. In embodiments, the system may be adapted to automatically check at least one of the brightness, color saturation, color balance, hue, and the like of the two or more displays relative to one another, and to selectively set at least one of the brightness, color saturation, color balance, hue, and the like of the two or more displays to a predetermined value. In addition, an embodiment of the system may be adapted to automatically check at least one of the brightness, color saturation, color balance, hue, and the like of the two or more displays relative to one another and, based on sensor feedback measurements, to selectively set at least one of the brightness, color saturation, color balance, hue, and the like of the two or more displays to a predetermined value.
In embodiments and with certain optical arrangements (such as front lighting LCoS) described herein in association, described two It is equal that contrast between a or more display, which can be adjusted to that equal or user discovers,.In embodiments, it compares Degree can be checked and correspondingly be adjusted for each display, and can be conditioned during manufacturing process aobvious to calibrate and adjust Show device, and can be measured in the fabrication process, in dynamic range, during the service life of glasses etc..In embodiments, it is The contrast of system can be automatically calibrated between two displays and compared to the external world.In embodiments, Yong Huke Compensate the difference between his eyes.Contrast can be compensated the eyesight and/or Undersensing of user by on-demand adjustment.In each reality It applies in example, how contrast ratio can be assembled according to optical module and be changed.As described herein, reducing stray light can be dedicated to being used for Assembling is to provide the technology of high contrast ratio.In embodiments, various types of single pixel brightness and/or the detection of more pixel colors Device can be inserted into optical element string, come some or all to the eye movement range (eyebox) for not fully entering display Light is sampled.It in embodiments and depends on detector puts where in the optical path, Real-time Feedback can be provided to system It compensates rigging error, LED and LCoS panel yield, vanning error, the compensation of hot and cold panel, and/or maintains personal user's calibration. 
In embodiments, the brightness and contrast of the display may be managed through good manufacturing practice. Further, during manufacture, quality-analysis testing may be performed and the display calibrated and compensated as needed. In addition, as components wear over the system's lifetime, or as the system heats and cools during use, the calibration may be modified using a look-up table of offsets. In embodiments, adjustment of brightness, color saturation, color balance, chromaticity, contrast, and the like may be performed by the user, by the manufacturer, and/or may be performed automatically by the eyepiece based on feedback, various programmed algorithms, and the like. In embodiments, sensor feedback may trigger automatic and/or manual adjustment of at least one of brightness, color saturation, color balance, chromaticity, contrast, and the like.
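The look-up-table-of-offsets approach described above can be sketched as follows. This is a minimal illustration only: the temperature breakpoints and offset values are invented for the example, not taken from the patent.

```python
# Hypothetical lifetime-calibration sketch: apply brightness/contrast offsets
# from a look-up table indexed by panel temperature. All values are invented.

TEMP_OFFSET_LUT = [  # (max_temp_C, brightness_offset, contrast_offset)
    (10.0, +0.08, +0.05),   # cold panel: boost output
    (25.0,  0.00,  0.00),   # nominal
    (40.0, -0.05, -0.03),   # warm panel: trim output
    (60.0, -0.12, -0.07),   # hot panel
]

def calibrated_settings(base_brightness, base_contrast, panel_temp_c):
    """Return (brightness, contrast) with the first matching LUT row applied."""
    for max_temp, b_off, c_off in TEMP_OFFSET_LUT:
        if panel_temp_c <= max_temp:
            return base_brightness + b_off, base_contrast + c_off
    # Hotter than the table covers: clamp to the last row's offsets.
    _, b_off, c_off = TEMP_OFFSET_LUT[-1]
    return base_brightness + b_off, base_contrast + c_off
```

In practice the same table structure could also be indexed by hours of use to compensate for component wear, as the text suggests.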
In embodiments, a system may include an interactive head-worn eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user observes the surrounding environment and displayed content, and an integrated image source for introducing the content to the optical assembly, wherein the optical assembly includes two or more displays, and wherein the contrast of at least one of the displays is adjusted so that the contrast of the two or more displays is balanced relative to one another within a predetermined range. In a further embodiment, the contrast is adjusted so that it is equal across the two or more displays. In embodiments, the contrast may be adjusted during the manufacturing process, across the dynamic output range produced by the integrated image source, and the like. In embodiments, the system may be adapted to periodically and automatically check the contrast of the two or more displays relative to one another over the service life of the eyepiece. In embodiments, the system may be adapted to automatically check the contrast of the two or more displays relative to one another and to selectively set the contrast of the two or more displays to a predetermined value. In embodiments, the system may be adapted to automatically check the contrast of the two or more displays relative to one another and, based on sensor feedback measurements, to selectively set the contrast of the two or more displays to a predetermined value. In embodiments, the contrast may be adjusted to compensate for a deficiency of the user. In embodiments, the contrast may be adjusted as a function of at least one of stray light and the light generated by the integrated image source. In embodiments, the contrast may be adjusted based on feedback from a detector in the optical path of the system. Further, the detector may include at least one of a single-pixel luminance detector and a multi-pixel color detector. In embodiments, real-time feedback may be provided to the system for at least one of compensating for assembly errors, LED and LCoS panel yield, binning errors, hot and cold panel compensation, and maintaining individual user calibration. In embodiments, the calibration of the contrast may be adjusted based on a look-up table of one or more offsets.
In one embodiment, a particular optical configuration described herein (such as front-lit LCoS) allows a camera to be inserted at many positions along the optical element train, placing the camera directly on the optical axis. For example, a camera sensor may be placed near the LCoS, such as camera 10232 in Figure 102B. This in turn allows measurement of pupil position, diameter, velocity, and direction, and direct imaging of the iris. These measurements and images may be used for secure login or for loading user settings, for detecting health conditions by measuring capillary size and/or thickness, for setting placeholders/bookmarks based on the last viewed area in a book, and the like. The data the camera collects about the various components of the eye may be used to control a user interface, determine stress levels, monitor alertness, detect reactions to external or projected stimuli, and the like. Because the front-lighting optics are sharp and compact, a camera with very small pixels can be placed in the optical element train, keeping the overall size of the optics small and ensuring high-resolution images. In embodiments, the camera may be placed at many points in the optical path by inserting a beam splitter as in Figure 185, but placements that put the camera on the LCoS PCB, embed it directly in the LCoS silicon backplane, or place it elsewhere in the optical element train are also possible.
In embodiments, when the camera is placed directly on the optical axis, the camera may be able to see or detect the eye, or directly see or detect the interior of the eye. In embodiments, the system may track eye movement, detect pupil dilation, measure pupil position, diameter, velocity, and direction, and directly image the iris. In embodiments, the camera may determine whether the user is merely looking around or is controlling the eyepiece. As just one example, the camera may sense an eye movement pattern that causes it to signal for eye tracking, such that it senses a predetermined control command the user may be executing with his eyes. As an example, the camera may recognize, based on the pattern of the user's eye movements, that the user's eyes are reading something in the user interface. In such cases, the camera initiates detection of a certain set of eye commands and signals the eyepiece to perform a certain function, such as opening e-mail. In embodiments, the camera may detect predetermined ways in which the user focuses on an object to control the eyepiece, such as by focusing on something for an extended period, or by focusing on something, quickly moving the eyes away, and then refocusing on the object, and the like. As the camera detects movement patterns such as these, it may signal the eyepiece to perform a certain function. As just one example, focusing, shifting the gaze, and refocusing again may cause the camera to signal a "double-click" on whatever the eyepiece user intends in the display. Of course, any such pattern and/or algorithm may be used to control the device through the user's eye movements. In embodiments, the camera may detect a certain movement pattern, and when that movement is detected while a particular application is in use, the camera may send a specific signal to the eyepiece based on that combination. As an example, if an e-mail program is open and the user's eyes exhibit a pattern consistent with reading, the camera may signal the eyepiece to open the specific message on which the user's eyes are focused. In embodiments, commands for controlling the eyepiece may be initiated based on the camera's detections.
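The focus, glance-away, refocus "double-click" pattern described above can be sketched as a small state machine over gaze samples. This is an illustrative sketch only; the dwell and gap thresholds and the fixed-rate sampling assumption are invented for the example.

```python
# Illustrative gaze "double-click" detector: dwell on a target, glance away,
# then return to the target within a short gap. Thresholds are assumed.

def detect_double_click(gaze_samples, target, radius=0.05,
                        min_dwell=3, max_gap=5):
    """gaze_samples: sequence of (x, y) gaze points at a fixed sample rate.
    Returns True if the gaze dwells on `target`, leaves, and returns."""
    def on_target(p):
        return (abs(p[0] - target[0]) <= radius and
                abs(p[1] - target[1]) <= radius)

    state, dwell, gap = "first_dwell", 0, 0
    for p in gaze_samples:
        if state == "first_dwell":
            if on_target(p):
                dwell += 1
            elif dwell >= min_dwell:
                state, gap = "away", 1      # long enough dwell; now away
            else:
                dwell = 0                   # stray fixation, start over
        else:  # state == "away"
            if on_target(p):
                return True                 # refocus completes the gesture
            gap += 1
            if gap > max_gap:
                state, dwell = "first_dwell", 0
    return False
```

A real implementation would additionally filter saccade noise and work in degrees of visual angle rather than normalized display coordinates.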
In embodiments, the camera's detection of pupil position, diameter, velocity, and direction, its direct imaging of the retina and/or iris, and the like may be employed as security measures. As an example, when the user puts on the eyepiece, the camera may perform a retina scan, and the retina scan may be checked against a database on the eyepiece, or one stored remotely, to identify the user. In embodiments, if the user is identified as the owner of the glasses or as an authorized user of the glasses, applications may be opened and the user granted access. If the glasses do not recognize the user, they may lock or disable all or some functions. In embodiments, the user may not need a password for this; the eyepiece may perform this function automatically. In embodiments, when the user is not recognized, the camera may capture identifying information about the wearer in the event the eyepiece has been stolen from its owner.
In embodiments, the eyepiece may perform a diagnosis of the user based on detection of eye movement, detection of pupil position, diameter, velocity, and direction, direct imaging of the retina and/or iris, and the like. For example, a diagnosis may be based on pupil dilation. For instance, if the user's pupils dilate in a manner consistent with lying, the camera and/or eyepiece may detect that the user is lying. Further, if the user has sustained a concussion, the pupils may change size even though a given amount of light is entering the eyes. The eyepiece may alert the user as to whether he has sustained a concussion. In embodiments, soldiers, athletes, and the like may be given eyepieces when they exit physical activity, and the eyepiece may be used to diagnose the user, for example for a concussion. The eyepiece may have a user database, onboard or separate from the eyepiece, that stores information associated with each user. In one embodiment, when an athlete leaves the field for the sideline, he may put on the glasses, which perform a retina scan to identify the user through the database, and then diagnose or examine the user by detecting the user's pupil size and comparing it with the pupil size expected under the given lighting conditions. If the user's data falls outside the expected range, the glasses may tell the user that his pupils are consistent with having sustained a concussion. Similar uses may detect possible drug intoxication, detect retinal damage, detect eye conditions, and the like.
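The sideline check described above, comparing measured pupil size against the size expected for the ambient light level, can be sketched as follows. The expected-diameter model and tolerance below are invented placeholders for illustration, not clinical values.

```python
# Illustrative pupil-size screening sketch. The lux-to-diameter model and
# the tolerance are hypothetical; a real system would use calibrated,
# per-user baselines from the eyepiece's user database.

def expected_pupil_mm(lux):
    """Rough monotone model: brighter light -> smaller pupil."""
    if lux >= 1000:
        return 2.5
    if lux >= 100:
        return 4.0
    if lux >= 10:
        return 5.5
    return 7.0

def pupil_flag(measured_mm, lux, tolerance_mm=1.0):
    """Flag a measurement that falls outside the expected range."""
    expected = expected_pupil_mm(lux)
    if abs(measured_mm - expected) <= tolerance_mm:
        return "within expected range"
    return "outside expected range - refer for evaluation"
```

As the text notes, a screening result like this would only prompt further evaluation; it is not itself a medical diagnosis.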
In embodiments, organic light-emitting diodes (OLEDs) may be used herein for micro-display and/or sensor applications, and may be used with Fraunhofer systems such as the OLEDCam (an OLED camera), or otherwise used for detecting eye movement, or otherwise used with the eyepiece to illuminate the user's eye, and the like. In embodiments, the device for detecting eye movement may be placed along the optical element train on the user's optical axis. In embodiments, micro-scale optical emitters and receivers may be integrated on the same chip. They may be implemented as bidirectional or unidirectional micro-displays with an array-type structure. In embodiments, the device may present and/or capture images simultaneously. The micro-display may form the basis of a system for personalized information, presenting information to the user and recognizing the interactions the user makes. With an eyepiece equipped with a bidirectional display, the user can perceive the environment as usual while additional information is presented. The visual information can adapt to the operating context of the system, and the user can interact through eye movements or actions. In embodiments, a CMOS chip may include a micro-display and a camera on a single substrate, the central element of the substrate being a nested active matrix of OLED pixels with photodiodes. In embodiments, the pixel cells may consist of red-green-blue-white and red-green-blue-photodiode pixel cells, and the like.
In embodiments, a system may include an interactive head-worn eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user observes the surrounding environment and displayed content, an integrated image source adapted to introduce the content to be displayed to the optical assembly, and a camera placed in the optical assembly along the optical axis such that the camera can observe at least a portion of the user's eye. In embodiments, the camera may be adapted to capture images of the eye, pupil, retina, eyelid, and/or eyelashes. In embodiments, a command for controlling the eyepiece may be initiated based on at least one image captured by the camera. In embodiments, a diagnosis of the user may be based on at least one image captured by the camera. An identification of the user may also be based on at least one image captured by the camera. As an example, the diagnosis may include a diagnosis of concussion. In embodiments of the system, the identification of the user may be deployed as a security feature of the eyepiece. In embodiments, the integrated image source may illuminate the eye during image capture by the camera. Further, the light from the image source may be modulated during image capture by the camera. In embodiments, the camera may include one or more organic light-emitting diodes (OLEDs). In embodiments, the user's eye, or the other features listed herein including the iris, pupil, eyelid, eyelashes, and the like, may be illuminated by various lights, LEDs, OLEDs, and the like. In embodiments, illumination of the user's eye may be used with imaging techniques for capturing data about the eye, for identification, and the like.
In one embodiment, the system may include an interactive head-worn eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user observes the surrounding environment and displayed content, an integrated image source adapted to introduce the content to be displayed to the optical assembly, and a device for detecting eye movement. In embodiments, the device for detecting eye movement may include micro-scale optical emitters and receivers integrated on the same chip. In embodiments, the device may include a CMOS chip comprising a micro-display and a camera on a single substrate. In embodiments, the device for detecting eye movement may be placed along the optical element train on the user's optical axis.
In embodiments, the camera is disposed in the optical assembly along the optical axis such that the camera observes at least a portion of the user's eye and can image one or more of the eye, pupil, retina, eyelid, and eyelashes. The integrated processor and camera may be adapted to: track the user's eye movements; measure at least one of pupil dilation, pupil position, pupil diameter, pupil velocity, and pupil direction; distinguish user eye movements intended as control or commands from user eye movements used for reading or gazing; use the user's eye movements as commands by which the processor controls functions of the integrated processor or of the interactive head-worn eyepiece; and use the user's eye movements as commands for controlling devices external to the user and external to the interactive head-worn eyepiece. A diagnosis or identification of the user may be based on at least one image captured by the camera, such as for concussion. The identification of the user may be deployed as a security feature of the eyepiece. The system may include a user input interface for controlling or signaling an external device based on the user's eye movements. The camera may be adapted to capture an image of the eye, wherein the image is compared against a database including other images of eyes to indicate a diagnosis. The optical axis of the integrated image source and the optical axis of the camera may be different. The optical axis of the integrated image source and at least a portion of the optical axis of the camera may be the same.
In an augmented reality eyepiece, a device such as a camera, micro-scale optical emitters and receivers integrated on the same chip, or a CMOS chip including a micro-display and a camera on a single substrate may detect the user's eye movements. The integrated image source may be adapted for at least one of: modulating the light from the image source during image capture by the camera, and illuminating the eye. The camera may include one or more organic light-emitting diodes (OLEDs). The device for detecting eye movement may be placed along the optical element train on the user's optical axis, or on an axis different from that of the user's eye. The integrated processor may be adapted to interpret the user's eye movements as commands for operating devices within the interactive head-worn eyepiece or external devices.
A method of detecting a user's eye movements may include wearing a head-worn eyepiece that includes an optical assembly through which the user observes the surrounding environment and displayed content, an integrated processor adapted to introduce the content to be displayed to the optical assembly, and an integrated image source and camera; detecting the user's eye movements with the camera and the integrated processor; and controlling a device through the eye movements and the integrated processor, wherein the camera detects movement of at least one of the user's eyes and interprets the movement as a command. The integrated processor may distinguish between eye movements intended as commands and eye movements intended as gazing. The method may include interpreting a predetermined eye movement as a command to perform a certain function. The method may include scanning at least one of the user's eyes to determine the user's identity. The method may include scanning at least one of the user's eyes to diagnose the user's physical condition. The camera may include at least one organic light-emitting diode (OLED). A specific eye movement may be interpreted as a specific command. The eye movement may be selected from the group consisting of: a blink, repeated blinks, a blink count, a blink rate, an eye open-close (slow blink), gaze tracking, eye movement to the side, up-and-down eye movement, side-to-side eye movement, eye movement through a sequence of positions, eye movement to a specific position, dwell time at a certain position, gazing toward a fixed object, and gazing through a specific portion of a lens of the head-worn eyepiece. The method may include controlling a device through the eye movements and a user input interface. The method may include displaying to the user a view of the surrounding environment captured with the camera or a second camera.
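The eye-movement vocabulary listed above can be mapped to commands with a simple interpreter over a stream of detected events. The sketch below is illustrative; the command names and the event encodings are invented, not taken from the patent.

```python
# Hypothetical eye-event command interpreter. Consecutive blinks are
# collapsed into a blink count; other events pass through as keys.

EYE_COMMANDS = {
    ("blink", 2): "select",
    ("blink", 3): "open_menu",
    ("slow_close",): "dismiss",
    ("dwell", "lens_corner"): "show_status",
}

def interpret(events):
    """Collapse a raw event stream into a list of commands."""
    commands, i = [], 0
    while i < len(events):
        if events[i] == "blink":
            j = i
            while j < len(events) and events[j] == "blink":
                j += 1
            key = ("blink", j - i)      # run-length of consecutive blinks
            i = j
        else:
            ev = events[i]
            key = tuple(ev) if isinstance(ev, (list, tuple)) else (ev,)
            i += 1
        if key in EYE_COMMANDS:
            commands.append(EYE_COMMANDS[key])
    return commands
```

This is where the processor's "command vs. gazing" distinction would apply in practice: events classified as reading or free gazing would simply never enter the stream.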
In embodiments, the eyepiece may employ aspects of subliminal control, such as images of the wearer's surroundings, images presented to the user at a rate below conscious perception, subliminal perception of the scene the viewer is seeing, and the like. For example, images may be presented by the eyepiece to the wearer at a rate the wearer does not perceive, yet with content the wearer registers subliminally, such as reminders, alerts (for example, asking the wearer to raise their level of attention to something, but without the user needing to become fully aware of the reminder), indications related to the wearer's immediate environment (such as the eyepiece detecting something potentially of interest in the wearer's field of view and drawing the wearer's interest to that object), and so on. In another example, the eyepiece may provide indicators to the wearer through a brain activity monitoring interface, where electrical signals in the brain fire before the individual consciously recognizes an image. For example, the brain activity monitoring interface may include electroencephalogram (EEG) sensors to monitor brain activity while the wearer observes the current environment. When the eyepiece senses through the brain activity monitoring interface that the wearer has begun to "notice" a certain element of the surroundings, the eyepiece may provide conscious-level feedback to the wearer so that the wearer becomes more aware of that element. For example, the wearer may subconsciously begin to perceive a known face seen in a crowd (such as a friend, suspect, or celebrity), and the eyepiece may provide a visual or audio indication to make the wearer more consciously aware of that individual. In another example, the wearer may view a product that attracts their attention at some subconscious level, and the eyepiece may provide the wearer with a conscious indication, more information about the product, an enhanced view of the product, a link to more information about the product, and so on. In embodiments, the eyepiece's ability to extend the wearer's reality to the subconscious level may allow the eyepiece to provide the wearer with an augmented reality beyond the wearer's normal conscious experience of the world around them.
In embodiments, the eyepiece may have multiple modes of operation in which control of the eyepiece is based in part on hand position, shape, motion, and the like. To provide this control, the eyepiece may use hand recognition algorithms to detect hand/finger shapes and then associate those hand configurations (possibly combined with hand motion) with commands. Realistically, since only a limited number of hand configurations and motions may be available to command the eyepiece, these hand configurations may need to be reused depending on the eyepiece's operating mode. In embodiments, certain hand configurations or motions may be assigned to transition the eyepiece from one mode to the next, thereby allowing hand motions to be reused. For example, and referring to Figure 15F, the user's hand 1504F may be moved into the field of view of a camera on the eyepiece, and, depending on the mode, the movement may then be interpreted as different commands, such as a circular motion 1508F, a motion across the field of view 1510F, a back-and-forth motion 1512F, and the like. In a simplified example, suppose there are two modes of operation: mode one for panning a view of the projected image and mode two for zooming the projected image. In this example, the user may want to use a left-to-right, finger-pointing hand motion to command a pan to the right. However, the user may also want to use a left-to-right, finger-pointing hand motion to command the image to zoom in. To allow dual use of this hand motion for both command types, the eyepiece may be configured to interpret the hand motion differently depending on the mode the eyepiece is currently in, with specific hand motions assigned for mode transitions. For example, a clockwise rotation may indicate a transition from pan to zoom mode, and a counterclockwise rotation may indicate a transition from zoom to pan mode. This example is meant to be illustrative rather than limiting in any way; those skilled in the art will recognize how this general technique may be used to implement a variety of command/mode configurations using the hand and fingers, such as hand-finger configuration-motion, two-hand configuration-motion, and the like.
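The pan/zoom example above amounts to a small state machine: the same swipe gesture maps to different actions depending on the current mode, while rotation gestures switch modes. A minimal sketch, with gesture and action names invented for illustration:

```python
# Illustrative mode-dependent gesture interpreter for the pan/zoom example.
# Gesture and action names are hypothetical.

class GestureController:
    MODE_SWITCH = {"rotate_cw": "zoom", "rotate_ccw": "pan"}
    ACTIONS = {
        "pan":  {"swipe_right": "pan_right", "swipe_left": "pan_left"},
        "zoom": {"swipe_right": "zoom_in",   "swipe_left": "zoom_out"},
    }

    def __init__(self):
        self.mode = "pan"

    def handle(self, gesture):
        if gesture in self.MODE_SWITCH:
            self.mode = self.MODE_SWITCH[gesture]
            return None                 # mode change only, no display action
        return self.ACTIONS[self.mode].get(gesture)
```

The same table-driven structure extends naturally to more modes and to hand-configuration keys (e.g., finger count plus motion) rather than plain gesture names.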
In embodiments, a system may include an interactive head-worn eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user observes the surrounding environment and displayed content, wherein the optical assembly includes a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly; and an integrated camera apparatus that images gestures, wherein the integrated processor identifies the gestures and interprets them as command instructions. The control instructions may provide manipulation of the content to be displayed, commands communicated to an external device, and the like.
In embodiments, control of the eyepiece may be enabled through eye movement, actions of the eye, and the like. For example, there may be a camera on the eyepiece that looks back at the wearer's eyes, where eye movements or actions may be interpreted as command information, such as through a blink, repeated blinks, a blink count, a blink rate, eye open-close, gaze tracking, eye movement to the side, up-and-down eye movement, side-to-side eye movement, eye movement through a sequence of positions, eye movement to a specific position, dwell time at a certain position, gazing toward a fixed object (such as a corner of a lens of the eyepiece), gazing through a specific portion of the lens, gazing at a real-world object, and the like. Additionally, eye control may allow the viewer to focus on a particular point in the image displayed by the eyepiece, and because the camera may be able to correlate the viewing direction of the eye with a point on the display, the eyepiece may be able to interpret commands through a combination of where the wearer is looking and a wearer action (such as a blink, touching an interface device, movement of a position-sensing device, and the like). For example, the viewer may be able to look at a certain object on the display and, as sensed by a position-sensing device, select that object with the movement of a finger.
In certain embodiments, the glasses may be equipped with eye-tracking devices for tracking the movement of the user's eye (or preferably both eyes); alternatively, the glasses may be equipped with sensors for six-degree-of-freedom movement tracking, i.e., head movement tracking. These devices or sensors are available from Chronos Vision GmbH of Berlin, Germany, and from ISCAN of Woburn, Massachusetts, USA. Retinal scanners may also be used for tracking eye movement. Retinal scanners may likewise be mounted in the augmented reality glasses and are available from a variety of companies, such as Tobii of Stockholm, Sweden, SMI of Teltow, Germany, and the aforementioned ISCAN.
The augmented reality eyepiece also includes a user input interface, as shown, to allow the user to control the device. Inputs for controlling the device may include any of the sensors discussed above, and may also include a trackpad, one or more function keys, and any other suitable local or remote device. For example, an eye-tracking device may be used to control another device, such as a video game or an external tracking device. As an example, Figure 29A depicts a user with an augmented reality eyepiece equipped with the eye-tracking device 2900A discussed elsewhere in this document. The eye-tracking device allows the eyepiece to track the direction of the user's eye (or preferably both eyes) and send the movements to the eyepiece's controller. A control system may include the augmented reality eyepiece and a control device for a weapon. The movements may then be transmitted to the control device for the weapon, the weapon being within the user's line of sight and controlled by the control device. Suitable software then converts the movements of the user's eyes into signals for controlling the movement of the weapon, such as quadrant (range) and azimuth (direction). Additional controls may be used together with the eye tracking, such as the user's trackpad or function keys. The weapon may be a large-caliber weapon such as a howitzer or mortar, or may be a small-caliber weapon such as a machine gun.
Suitable software then converts the movements of the user's eyes into signals for controlling the movement of the weapon, such as the weapon's quadrant (range) and azimuth (direction). Additional controls may be used for single or continuous firing of the weapon, such as the user's trackpad or function keys. Alternatively, the weapon may be stationary and non-directional, such as an emplaced mine or a shaped-charge weapon, and may be protected by safety devices, such as by requiring specific coded commands. The user of the augmented reality device may activate the weapon by sending the appropriate codes and commands, without using the eye-tracking features.
In embodiments, control of the eyepiece may be enabled through the wearer's gestures. For example, the eyepiece may have cameras that look outward (such as forward, to the side, or downward) and interpret the wearer's hand gestures or movements as control signals. Hand signals may include passing the hand in front of the camera, holding hand positions or sign language in front of the camera, pointing at a real-world object (such as to activate an enhancement of the object), and the like. Hand motions may also be used to manipulate objects displayed on the inside of the translucent lens, such as moving an object, rotating an object, deleting an object, opening or closing a screen or window in the image, and the like. Although hand motions are used in the preceding examples, any portion of the body, or any object held or worn by the wearer, may also be used by the eyepiece for gesture recognition.
In embodiments, head motion control may be used to send commands to the eyepiece, where a motion sensor such as an accelerometer, a gyroscope, or any other sensor described herein may be mounted on the wearer's head, on the eyepiece, in a hat, in a helmet, and the like. Referring to Figure 14A, head motions may include quick motions of the head, such as jerking the head forward and/or backward 1412, jerking the head up and/or down 1410, nodding the head from side to side, dwelling in a position, such as to the side, moving and holding in place, and the like. Motion sensors may be integrated into the eyepiece, or mounted on the user's head or in a head covering (such as a hat or helmet) connected to the eyepiece by a wired or wireless connection. In embodiments, the user may wear the interactive head-worn eyepiece, where the eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content. The optical assembly may include a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly. At least one of a plurality of head motion sensing control devices may be integrated with or associated with the eyepiece, providing control commands to the processor as command instructions based on sensing predefined head motion characteristics. The head motion characteristic may be a nod of the user's head such that the nod is a distinct motion dissimilar from ordinary head motion. The distinct motion may be a jerking motion of the head. The control instructions may provide manipulation of the displayed content, commands communicated to an external device, and the like. Head motion control may be used in combination with other control mechanisms, such as using another control mechanism as discussed herein to activate a command and using head motion to execute it. For example, the wearer may want to move an object to the right and, through eye control as discussed herein, select the object and activate head motion control. Then, by tilting their head to the right, the object may be commanded to move to the right, and the command terminated through eye control.
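The "distinct motion dissimilar from ordinary head motion" described above can be sketched as a threshold test over accelerometer samples: a jerk is a brief, high-magnitude spike that settles back toward rest, unlike sustained turning. The thresholds and window shape below are assumed values for illustration.

```python
# Illustrative head-jerk detector over one window of lateral acceleration
# samples (in g, fixed sample rate). Thresholds are hypothetical.

def detect_head_jerk(ax_samples, jerk_threshold=2.0, baseline=0.5):
    """Return 'right', 'left', or None for one window of samples."""
    peak = max(ax_samples, key=abs, default=0.0)
    if abs(peak) < jerk_threshold:
        return None                 # indistinguishable from ordinary motion
    # Require the window to settle back near rest afterwards, i.e. a
    # jerk-and-hold rather than a sustained head turn.
    tail = ax_samples[-3:]
    if all(abs(a) <= baseline for a in tail):
        return "right" if peak > 0 else "left"
    return None
```

A production version would fuse gyroscope data and run over a sliding window, but the same spike-then-settle criterion is what separates a command jerk from normal head movement.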
In embodiments, the eyepiece may be controlled through audio, such as through a microphone. Audio signals may include speech recognition, voice recognition, sound recognition, sound detection, and the like. Audio may be detected through a microphone on the eyepiece, a throat microphone, a jawbone microphone, a boom microphone, a headphone, an earbud with microphone, and the like.
In embodiments, command inputs may provide a plurality of control functions, such as turning the eyepiece projector on/off, turning audio on/off, turning the camera on/off, turning augmented reality projection on/off, turning GPS on/off, interaction with the display (e.g. selecting/accepting a displayed function, replaying a captured image or video, and the like), interaction with the real world (e.g. capturing an image or video, turning a page of a displayed book, and the like), performing actions with an embedded or external mobile device (e.g. a mobile phone, navigation device, music device, VoIP, and the like), browser controls for the Internet (e.g. submit, next result, and the like), email controls (e.g. read email, display text, text-to-speech, compose, select, and the like), GPS and navigation controls (e.g. save a position, recall a saved position, show directions, view a map of a position, and the like), and so forth. In embodiments, the eyepiece, or portions thereof, may be automatically turned on and/or off through sensor indications, such as from an IR sensor, an accelerometer, a force sensor, a micro-switch, a capacitive sensor, an eye-tracking detection facility, and the like. For example, the eyepiece may be automatically turned off as the user takes the eyepiece off their head, such as through a capacitive sensor sensing that the eyepiece no longer has physical contact with the user's skin (e.g. at the bridge of the user's nose). One skilled in the art will appreciate other similar configurations for sensing when the eyepiece has been taken off. In embodiments, the eyepiece may sense when a detachable component is attached to or removed from the eyepiece, and may use that sensing to turn aspects of the eyepiece on/off. For instance, a portion of the optics may be detachable, and when that optics portion is removed, power to that half of the eyepiece system is turned off to save battery power. The present disclosure may include a power management facility, wherein the power management facility controls the power supplied to selected eyepiece components in response to a sensor. The eyepiece may be mounted in a frame with a nose bridge and foldable temples, wherein hinges of the frame attach to the folding temples, and wherein a sensor may be mounted in the nose bridge of the frame, in a temple, in a hinge, and the like. The selected component may be an image source, a processor, and the like. The power management facility may be in a sleep mode when the user is not wearing the eyepiece, wherein the sleep mode may include periodically reading the sensor, and wherein the power management facility transitions to a wake mode and powers the eyepiece when it detects that the user is wearing the eyepiece. The power management facility may reduce power to components based on usage of eyepiece functions, power remaining in the integrated battery, network availability, rate of power consumption, and the like. The reduction of power may be based on user preferences. The user may override a power reduction through a command. An indication may be provided to the user through a user interface of the eyepiece when power is reduced. An electrochromic density in the optical assembly may be increased if the brightness level of the image source is reduced as a result of reducing power to the image source.
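The sensor-driven sleep/wake behavior described above can be illustrated with a small state machine. This is a hypothetical sketch only — the component names, the polling contract, and the low-battery threshold are assumptions for illustration, not details taken from the disclosure:

```python
from enum import Enum

class Mode(Enum):
    SLEEP = "sleep"
    AWAKE = "awake"

class PowerManager:
    """Illustrative power management facility: sleep until the capacitive
    nose-bridge sensor reports skin contact, wake and power components,
    and shed non-essential components when the battery runs low."""

    def __init__(self, low_battery_pct=20):
        self.mode = Mode.SLEEP
        self.low_battery_pct = low_battery_pct
        self.powered = set()  # components currently receiving power

    def poll(self, skin_contact, battery_pct):
        """Called periodically -- the 'periodic sensor read' of sleep mode."""
        if skin_contact and self.mode is Mode.SLEEP:
            # User put the eyepiece on: transition to wake mode.
            self.mode = Mode.AWAKE
            self.powered = {"image_source", "processor", "gps", "camera"}
        elif not skin_contact and self.mode is Mode.AWAKE:
            # Eyepiece taken off: power everything down.
            self.mode = Mode.SLEEP
            self.powered.clear()
        if self.mode is Mode.AWAKE and battery_pct < self.low_battery_pct:
            # Reduce power to non-essential components first
            # (which components count as non-essential is assumed here).
            self.powered -= {"gps", "camera"}
        return self.mode
```

In a real device the `poll` call would be driven by a low-power timer, and a user-command path would be needed to override the reduction, as the text describes.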
In embodiments, the eyepiece may provide 3D display imaging to the user, such as by conveying stereoscopic imagery, auto-stereoscopy, computer-generated holograms, volumetric display images, stereograms/stereoscopes, view-sequential displays, electro-holographic displays, parallax "two-view" displays and parallax panoramagrams, re-imaging systems, and the like, creating a perception of 3D depth for the viewer. Display of 3D imagery to the user may employ different images presented to the user's left and right eyes, such as where the left and right optical paths have some optical component that differentiates the images, where a projector facility projects different images to the user's left and right eyes, and the like. The optical path, including the path from the projector facility to the user's eye, may include a graphical display device that forms a visual representation of an object in three physical dimensions. A processor, such as the integrated processor in the eyepiece or a processor in an external facility, may provide 3D image processing as at least a step in the generation of the 3D image for the user.
In embodiments, holographic projection technologies may be employed to present a 3D imaging effect to the user, such as computer-generated holography (CGH), a method of digitally generating holographic interference patterns. For instance, a holographic image may be projected by a holographic 3D display, such as a display that operates on the basis of interference of coherent light. Computer-generated holograms have the advantage that the objects one wants to show need not possess any physical reality at all; that is, they may be generated entirely as "synthetic holograms". There are a plurality of different methods for calculating the interference pattern for a CGH, drawing on the fields of holographic information and computational field reduction as well as computational and quantization techniques. For instance, the Fourier transform method and point-source holograms are two examples of computational techniques. The Fourier transform method may be used to simulate the propagation of each depth plane of the object to the hologram plane, where reconstruction of the image may occur in the far field. In an example process, there may be two steps, where the light field in the remote observer plane is first calculated and then Fourier transformed back to the lens plane, where the wavefront to be reconstructed by the hologram is the superposition, in depth, of the Fourier transforms of each object plane. In another example, a target image may be multiplied by a phase pattern to which an inverse Fourier transform is applied. Intermediate holograms may then be generated by shifting the image products, and combined to create a final set. The final set of holograms may then be approximated to form kinoforms for sequential display to the user, where a kinoform is a phase hologram in which the phase modulation of the object wavefront is recorded as a surface-relief profile. In the point-source hologram method, the object is broken down into self-luminous points, an elementary hologram is calculated for each point source, and the final hologram is synthesized by superimposing all of the elementary holograms.
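The point-source method just described — decompose the object into self-luminous points, compute an elementary (spherical-wave) hologram for each, and superimpose them — can be sketched numerically. The grid size, pixel pitch, wavelength, and the unit-amplitude on-axis plane reference wave below are illustrative assumptions, not parameters from the disclosure:

```python
import cmath
import math

def point_source_hologram(points, grid=8, pitch=1.0e-5,
                          wavelength=6.33e-7, z=0.05):
    """Minimal point-source CGH sketch: each object point (x, y, z, amplitude)
    contributes a spherical elementary wave at every hologram-plane pixel;
    the final hologram is the superposition of all elementary waves,
    interfered with a unit plane reference wave and recorded as intensity."""
    k = 2 * math.pi / wavelength
    holo = []
    for iy in range(grid):
        row = []
        for ix in range(grid):
            x = (ix - grid / 2) * pitch
            y = (iy - grid / 2) * pitch
            field = 0 + 0j
            for (px, py, pz, amp) in points:
                # Distance from this object point to this hologram pixel.
                r = math.sqrt((x - px) ** 2 + (y - py) ** 2 + (z + pz) ** 2)
                # Superimpose the elementary spherical wave exp(ikr)/r.
                field += amp * cmath.exp(1j * k * r) / r
            # Interference with an on-axis plane reference wave of amplitude 1.
            row.append(abs(field + 1.0) ** 2)
        holo.append(row)
    return holo
```

A production CGH pipeline would of course use FFTs and far larger grids; this only shows the superposition structure of the point-source approach.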
In one embodiment, 3D or holographic imagery may be enabled by a dual projector system, where two projectors are stacked on top of each other for a 3D image output. Holographic projection mode may be entered through a control mechanism described herein or through the capture of an image or signal (e.g. an outstretched hand with the palm up, an SKU, an RFID read, and the like). For example, a wearer of the eyepiece may view a letter "X" on a piece of cardboard, which causes the eyepiece to enter holographic mode and turn on the second, stacked projector. Selecting which hologram to display may be done with a control technique. The projector may project the hologram onto the cardboard over the letter "X". Associated software may track the position of the letter "X" and move the projected image along with the movement of the letter "X". In another example, the eyepiece may scan an SKU, such as the SKU on a toy construction kit, and a 3D image of the completed toy may be accessed from an online source or from non-volatile memory. Interaction with the hologram (rotating it, zooming in/out, and the like) may be done using the control mechanisms described herein. Scanning may be enabled by associated bar code/SKU scanning software. In another example, a keyboard may be projected in space or onto a surface. The holographic keyboard may be used in, or to control, any of the associated applications/functions.
In embodiments, eyepiece facilities may provide for locking the position of a virtual keyboard relative to a real environmental object (e.g. a table, a wall, a vehicle dashboard, and the like), where the virtual keyboard then does not move as the wearer moves their head. In an example, and referring to Figure 24, the user may be sitting at a table wearing the eyepiece 2402, and may wish to input text into an application such as a word processor, a web browser, a communications application, and the like. The user may be able to bring up a virtual keyboard 2408, or another interactive control element (e.g. a virtual mouse, calculator, touch screen, and the like), to use for input. The user may provide a command for bringing up the virtual keyboard 2408, and use a hand gesture 2404 to indicate the fixed location of the virtual keyboard 2408. The virtual keyboard 2408 may then remain fixed in space relative to the outside environment, such as fixed to a location on the table 2410, where the eyepiece facility keeps the location of the virtual keyboard 2408 on the table 2410 even as the user turns their head. That is, the eyepiece 2402 may compensate for the user's head motion in order to keep the user's view of the virtual keyboard 2408 located on the table 2410. In embodiments, the user may wear an interactive head-mounted eyepiece, where the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content. The optical assembly may include a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly. An integrated camera facility may be provided that images the surrounding environment and identifies a user hand gesture as an interactive control element location command (e.g. a hand moved in a certain way, positioned in a certain way, a particular finger configuration, and the like). In response to the interactive control element location command, the location of the interactive control element may then remain fixed in position relative to an object in the surrounding environment, regardless of changes in the user's viewing direction. In this way, the user may be able to utilize a virtual keyboard in much the same way they would a physical keyboard, where the virtual keyboard remains in the same location. However, in the case of a virtual keyboard there are no "physical limitations", such as gravity, to restrict where the user may locate the keyboard. For instance, the user could be standing next to a wall and set the keyboard position on the wall, and so on. One skilled in the art will appreciate that this "virtual keyboard" technique may be applied to any controller, such as a virtual mouse, virtual touchpad, virtual game interface, virtual phone, virtual calculator, virtual paintbrush, virtual drawing tablet, and the like. For example, a virtual touchpad may be visualized to the user through the eyepiece, positioned by the user through hand gestures, and used in place of a physical touchpad.
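Keeping a virtual keyboard fixed to a spot on the table while the head turns amounts to re-expressing a world-fixed position in head-relative coordinates each frame. The following is a minimal 2D, yaw-only sketch; the coordinate conventions are assumed, and a real system would use full 6-DOF head pose:

```python
import math

def display_offset(world_xy, head_pos_xy, head_yaw_rad):
    """Head-relative position of a world-locked element (e.g. the virtual
    keyboard): take the world-frame offset from head to element and rotate
    it by the inverse of the head's yaw, so the element stays fixed in the
    world as the head turns."""
    dx = world_xy[0] - head_pos_xy[0]
    dy = world_xy[1] - head_pos_xy[1]
    c = math.cos(-head_yaw_rad)
    s = math.sin(-head_yaw_rad)
    # Standard 2D rotation of the offset into the head frame.
    return (c * dx - s * dy, s * dx + c * dy)
```

Rendering the keyboard at this head-relative offset every frame is what makes it appear nailed to the table even while the wearer looks around.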
In embodiments, the eyepiece facility may employ visualization techniques, such as applying deformations resembling parallax, keystone distortion, and the like, to present the projection of an object (e.g. a virtual keyboard, keypad, calculator, notepad, joystick, control panel, book, and the like) on a surface. For instance, projecting a keyboard with the appropriate perspective onto the tabletop facing the user may be aided by applying a keystone distortion effect, where the projection provided to the user through the eyepiece is distorted so that it appears to lie flat on the surface of the table. Further, these techniques may be applied dynamically, so that the appropriate perspective is provided even as the user moves around with respect to the surface.
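Keystone distortion is a projective (homography) warp. Below is a minimal sketch of building and applying a homography that narrows the far edge of an overlay so it appears to recede into the surface; the particular parameterization over a centered square is an illustrative choice, not the disclosed method:

```python
def apply_homography(H, pt):
    """Apply a 3x3 projective transform to a 2D point (homogeneous divide)."""
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

def keystone(fore_shorten):
    """Homography that narrows the top (far) edge of a square centered on
    the origin -- the classic trapezoid that makes a flat overlay appear
    to lie on a receding tabletop. fore_shorten in (0, 1) sets how strongly
    the far edge shrinks."""
    return [[1.0, 0.0, 0.0],
            [0.0, 1.0, 0.0],
            [0.0, fore_shorten, 1.0]]
```

Applying `keystone(g)` to the corners (±1, ±1) maps the top corners inward by a factor 1/(1+g) and the bottom corners outward by 1/(1−g), producing the trapezoid; recomputing `g` from the user's pose is what makes the effect dynamic.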
In embodiments, the eyepiece facility may provide gesture recognition, which may be used to deliver a keyboard and mouse experience with the eyepiece. For example, the system may be able to enable a virtual desktop by utilizing an image of a keyboard, mouse, and fingers overlaid on the lower middle portion of the display while tracking finger positions in real time. With gesture recognition, tracking may be performed without wires or externally powered devices. In another example, fingertip locations may be tracked without wires or external power through gesture recognition performed by the eyepiece, such as by using gloves that have a passive RFID chip in each fingertip. In this instance, each RFID chip may have its own response characteristics so that multiple fingers can be read simultaneously. The RFID chips may be paired with the glasses so that they can be differentiated from other RFIDs that may be operating nearby. The glasses may provide a signal to activate the RFID chips and may have two or more receiving antennas. Each receiving antenna may be connected to a phase-measurement circuit element, which in turn provides input to a position determination algorithm. The position determination algorithm may also provide velocity and acceleration information, and may ultimately provide keyboard and mouse information to the eyepiece operating system. In embodiments, using two receiving antennas, the angular position of each fingertip may be determined from the phase difference between the receiving antennas. The relative phase differences between the RFID chips may then be used to determine the radial positions of the fingertips.
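The two-antenna angular estimate mentioned above follows from the standard far-field relation Δφ = 2π·d·sin(θ)/λ between a phase difference and a bearing. A sketch, assuming the antenna spacing is small enough that the measured phase difference is unambiguous (no 2π wrapping):

```python
import math

def bearing_from_phase(delta_phi, antenna_sep, wavelength):
    """Far-field bearing (radians) of one fingertip RFID chip, from the
    phase difference measured between two receiving antennas:

        delta_phi = 2*pi * antenna_sep * sin(theta) / wavelength

    solved for theta. Assumes |delta_phi * wavelength / (2*pi * d)| <= 1,
    i.e. an unambiguous (non-wrapped) phase measurement."""
    return math.asin(delta_phi * wavelength / (2 * math.pi * antenna_sep))
```

With two or more antenna pairs, bearings of this kind plus the relative phase between chips give the position-determination algorithm its inputs; the real algorithm in the text also derives velocity and acceleration, which are omitted here.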
In embodiments, the eyepiece facility may employ visualization techniques to present the projection of a previously obtained medical scan (e.g. X-ray, ultrasound, MRI, PET scan, and the like) on the wearer's body. For example, and referring to Figure 24A, the eyepiece may have access to an X-ray image taken of the wearer's hand. The eyepiece may then view the wearer's hand 2402A with its integrated camera and superimpose a projected image 2404A of the X-ray over the hand. Further, the eyepiece may be able to maintain the superposition of the image as the wearer moves their hand, keeping the two in registration relative to each other. In embodiments, the technique may also be implemented when the wearer is looking into a mirror, where the eyepiece superimposes an image onto the reflected image. The technique may be used as part of a diagnostic procedure, for physical rehabilitation, for encouragement in exercise and dieting during a treatment, to explain a diagnosis or condition to a patient, and the like. The image may be the wearer's own image, a generic image from a database of medical condition imagery, and the like. A generic superposition may show internal problems of a type that is typical for a given physical condition, a projection of what the body will look like after following a particular routine for a period of time, and the like. In embodiments, an external control device, such as a pointer controller, may allow for manipulation of the image. Further, the image superposition may be synchronized among multiple people, each wearing an eyepiece as described herein. For instance, a patient and a doctor may both have the image projected onto the patient's hand, where the doctor may then explain a physical ailment while the patient views the synchronized image of the projected scan along with the doctor's explanation.
In embodiments, the eyepiece facility may provide for removing the portions of a virtual keyboard projection where an intervening obstruction appears (e.g. the user's hand getting in the way, where it is not desired to project the keyboard onto the user's hand). In an example, and referring to Figure 30, the eyepiece 3002 may provide a projected virtual keyboard 3008 to the wearer, such as onto a tabletop. The wearer may then reach "over" the virtual keyboard 3008 to type on it. Since the keyboard is only a projected virtual keyboard rather than a physical one, without some compensation to the projected image the projected virtual keyboard would be projected "onto" the back of the user's hand. However, as in this example, the eyepiece may provide compensation to the projected image such that the portion of the wearer's hand 3004 that obstructs the intended projection of the virtual keyboard onto the table may be removed from the projection. That is, it may not be desirable for portions of the keyboard projection 3008 to be visualized onto the user's hand, and so the eyepiece subtracts the portion of the virtual keyboard projection that is co-located with the wearer's hand 3004. In embodiments, the user may wear an interactive head-mounted eyepiece, where the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content. The optical assembly may include a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly. The displayed content may include an interactive control element (e.g. a virtual keyboard, virtual mouse, calculator, touch screen, and the like). An integrated camera facility may image a body part of the user as it interacts with the interactive control element, wherein the processor removes a portion of the interactive control element by subtracting the portion that is determined to be co-located with the imaged user body part, based on the user's view. In embodiments, this technique of partial projected-image removal may be applied to other projected images and obstructions, and is not meant to be restricted to this example of a hand over a virtual keyboard.
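Subtracting the hand-co-located portion of the projection is, at its simplest, a per-pixel mask operation. A toy sketch over binary masks — segmenting the hand from the camera image is assumed to happen elsewhere, and a real renderer would operate on full-resolution frames:

```python
def mask_projection(projection, hand_mask):
    """Blank every projected pixel that is co-located with the imaged hand,
    so the keyboard is not drawn 'onto' the back of the user's hand.
    Both inputs are equal-sized 2D lists; in hand_mask a truthy cell means
    'hand here', and 0 in the result means 'do not draw'."""
    return [[0 if hand else px for px, hand in zip(prow, hrow)]
            for prow, hrow in zip(projection, hand_mask)]
```

The same subtraction applies to any projected content with any intervening obstruction, as the paragraph notes.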
In embodiments, the eyepiece facility may provide intervening-obstruction handling for any virtual content displayed over "real" world content. If a frame of reference is determined to be placed at a certain distance, any object that passes between the virtual image and the viewer may be removed from the displayed content, so as not to present an interruption to a user expecting to view the displayed information at that specified distance. In embodiments, various adjustable-focus techniques may also be used to increase the perception of the levels of distance between the content being viewed.
In embodiments, the eyepiece facility may provide the ability to determine an intended text input from a sequence of character contacts swiped across a virtual keyboard, such as with a finger, a stylus, the whole hand, and the like. For example, and referring to Figure 37, the eyepiece may be projecting a virtual keyboard 3700, where the user wishes to input the word "wind". Normally, the user would discretely press the key positions corresponding to "w", then "i", then "n", and finally "d", and a facility associated with the eyepiece (a camera, accelerometer, and the like, as described herein) would interpret each position as the letter at that position. However, the system may also be able to monitor the movement, or swipe, of the user's finger or other pointing device across the virtual keyboard and determine a best-fit match for the pointer's movement. In the figure, the pointer starts at the character "w" and sweeps a path 3704 through the characters e, r, t, y, u, i, k, n, b, v, f, and d, stopping at d. The eyepiece may observe this sequence, such as by determining the sequence through an input path analyzer, feed the sensed sequence into a word-matching search facility, and output a best-fit word, in this case "wind" as text 3708. In embodiments, the eyepiece may monitor the movement of the pointing device across the keyboard and determine the word more directly, such as through automatic whole-word matching, pattern recognition, object recognition, and the like, where some "delimiter" indicates the space between words, such as a pause in the movement of the pointing device, a tap of the pointing device, a circling motion of the pointing device, and the like. For example, a whole swipe path may be used with a pattern or object recognition algorithm to associate an entire word with the movement of the user's finger, matching the word as a discrete pattern rather than forming it character by character, with the pauses between movements serving as the delineation between words. The eyepiece may provide the best-fit word, a list of best-fit words, and the like. In embodiments, the user may wear an interactive head-mounted eyepiece, where the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content. The optical assembly may include a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly. The displayed image may include an interactive keyboard control element (e.g. a virtual keyboard, calculator, touch screen, and the like), where the keyboard control element is associated with an input path analyzer, a word-matching search facility, and a keyboard input interface. The user may input text by sliding a pointing device (e.g. a finger, a stylus, and the like) across the character keys of the keyboard input interface in a sliding motion through the approximate sequence of the word the user would like to input as text, where the input path analyzer determines the characters contacted along the input path, the word-matching facility finds the best word match for the contacted character string, and that best word match is entered as the input text. In embodiments, the referenced display content may be something other than a keyboard, such as a sketch pad for handwritten text, or other interfaces that reference a joystick pad for controlling a game or a real robot or aircraft, and the like. Another example may be a virtual jazz drum kit, such as colored pads that make a sound as the user "taps" them. The ability of the eyepiece to interpret a pattern of movement across a surface allows projected content to serve as a reference, gives the user something to point at, and provides the user with visual and/or audio feedback. In embodiments, the "movement" detected by the eyepiece may be the movement of the user's eyes as the user looks at a surface. For instance, the eyepiece may have a facility for tracking the movement of the user's eyes, and by possessing both the content display location of the projected virtual keyboard and the gaze direction of the user's eyes, the eyepiece may be able to detect the line-of-sight movement of the user's eyes across the keyboard and then interpret that movement into text as described herein.
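The input-path-analyzer / word-matching pipeline can be approximated with a subsequence test over the swept key sequence. This is a stand-in sketch only — the first/last-key constraint and longest-word tie-break below are assumptions replacing the real matcher, which would use a lexicon with frequency or language-model scoring:

```python
def is_subsequence(word, path):
    """True if the letters of word occur, in order, along the swept path."""
    it = iter(path)
    return all(ch in it for ch in word)  # 'in' advances the iterator

def best_fit(path, lexicon):
    """Swipe decoding sketch: keep lexicon words whose letters occur in
    order along the swept key path and that share its first and last key,
    then prefer the longest candidate (a crude stand-in for a real
    language-model score)."""
    candidates = [w for w in lexicon
                  if w and w[0] == path[0] and w[-1] == path[-1]
                  and is_subsequence(w, path)]
    return max(candidates, key=len) if candidates else None
```

On the example from the figure — path w, e, r, t, y, u, i, k, n, b, v, f, d — the decoder recovers "wind" from a small lexicon.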
In embodiments, the eyepiece may provide the ability to command the eyepiece through "air writing" gestures, such as the wearer drawing out letters, words, and the like, in the air within the field of view of the embedded eyepiece camera using their finger, where the eyepiece interprets the finger motion as letters, words, or symbols for commanding, signing, writing, emailing, texting, and the like. For example, the wearer may use this technique to sign a document with an "air signature". The wearer may use this technique to write text, such as in an email, a text message, a document, and the like. The eyepiece may recognize a symbol made through hand motion as a control command. In embodiments, as described herein, air writing may be implemented through gesture recognition interpreted from images captured by the eyepiece camera, or through another input control device, such as an inertial measurement unit (IMU) in a device mounted on the user's finger or hand.
In embodiments, the eyepiece facility may provide for presenting displayed content corresponding to an identified marker, where the marker indicates an intention to display that content. That is, the eyepiece may be commanded to display particular content based on sensing a predetermined external visual cue. The visual cue may be an image, an icon, a photo, face recognition, a hand configuration, a body configuration, and the like. The displayed content may be an interface device brought up for use, a navigation aid to help the user find a location once they have reached a certain travel destination, an advertisement when the eyepiece views a target image, an informational profile, and the like. In embodiments, visual marker cues and their associated content for display may be stored in memory on the eyepiece, stored in an external computer storage facility and imported as needed (such as by geographic location, proximity to a trigger target, command by the user, and the like), generated by a third party, and the like. In embodiments, the user may wear an interactive head-mounted eyepiece, where the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content. The optical assembly may include a corrective element that corrects the user's view of the surrounding environment, an integrated processor for handling content for display to the user, and an integrated image source for introducing the content to the optical assembly. An integrated camera facility may be provided that images an external visual cue, wherein the integrated processor identifies the external visual cue and interprets it as a command to display content associated with that cue. Referring to Figure 38, in embodiments the visual cue 3812 may be included on a sign 3814 in the surrounding environment, where the projected content is associated with an advertisement. The sign may be a billboard, and the advertisement a personalized advertisement based on the user's preferences profile. The visual cue 3802, 3808 may be a hand gesture, and the projected content a projected virtual keyboard 3804, 3810. For example, the hand gesture may be a thumb-and-index-finger gesture 3802 from a first user hand, with the virtual keyboard 3804 projected onto the palm of the first user hand, where the user can type on the virtual keyboard with a second user hand. The hand gesture 3808 may be a thumb-and-index-finger gesture combination of both user hands, with the virtual keyboard 3810 projected between the user's hands as configured by the gesture, where the user can type on the virtual keyboard using the thumbs of both hands. Visual cues may provide the wearer of the eyepiece with an automated resource for associating a predetermined external visual cue with a desired outcome in the form of projected content, so that the wearer does not have to search for the cues themselves.
In embodiments, the eyepiece may include a visual recognition language translation facility that provides translation of visually presented content, such as road signs, menus, billboards, shop signs, books, magazines, and the like. The visual recognition language translation facility may utilize optical character recognition to recognize letters from the content and match strings of letters to words and phrases through a translation database. This capability may be entirely contained within the eyepiece, such as in an offline mode, or contained at least in part in an external computing facility, such as on an external server. For example, the user may be in a foreign country, where the wearer of the eyepiece does not understand the signs, menus, and the like, but the eyepiece may provide translations of them. These translations may be presented as annotations to the user, may replace the foreign-language words with the translation (such as on a sign), may be provided to the user as an audio translation, and the like. In this way, the wearer would not have to labor over word translations; instead, the word translations are presented automatically. In one example, the user of the eyepiece may be an Italian visiting the US, with the need to interpret a great number of road signs in order to drive safely. Referring to Figure 38A, the Italian user of the eyepiece is viewing a US STOP sign 3802A. In this example, the eyepiece may recognize the letters on the sign, translate the word "stop" into the Italian "arresto", and make the stop sign 3804A appear to read the word "arresto" rather than "stop". In embodiments, the eyepiece may also provide a simple translation message to the wearer, provide an audio translation, provide a translation dictionary to the wearer, and the like. The present disclosure may include an interactive head-mounted eyepiece worn by a user, where the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content, and an integrated image source adapted to introduce the content to the optical assembly; an integrated camera for imaging text viewed in the surrounding environment; and an optical character recognition facility for correlating one or more characters from the viewed text to one or more characters of a first language and for correlating the one or more characters of the first language to one or more characters of a second language, wherein the integrated image source presents the one or more characters of the second language as displayed content, and wherein the displayed content is locked in position relative to the one or more characters from the viewed text. The presentation of the one or more characters of the second language may appear as an annotation to the user, with the displayed content placed relative to the original viewed text. The presentation of the one or more characters of the second language may be superimposed over the viewing location of the original viewed text, such as the one or more characters of the second language being superimposed over the original viewed text in a presentation matching the character characteristics of the original viewed text. The viewed text may be located on a sign, a printed document, a book, a road sign, a billboard, a menu, and the like. The optical character recognition facility may be incorporated into the eyepiece, provided external to the eyepiece, or provided in a combination of internal and external. The one or more characters may be a word, a phrase, an alphanumeric string, and the like. The one or more characters of the second language may be saved to an external facility and tagged for availability when a second eyepiece views the text, where the tagging may include a geographic location indication, an object identifier, and the like. Further, the presentation of the one or more characters of the second language may be stored when the view of the text moves out of the view of the eyepiece, such that it is recalled for presentation when the text moves back within the view of the eyepiece.
In one example, the eyepiece may be used in adaptive applications, such as for blind users. In embodiments, the results of face recognition or object identification may be processed into an audible result and presented as audio to the wearer of the glasses through an associated earbud/headphone. In other embodiments, the results of face recognition or object identification may be converted into tactile vibrations in the glasses or in an associated controller. In one example, if a person is standing in front of the user of the adaptive glasses, the camera may image the person and send the image to the integrated processor for processing by facial recognition software, or to facial recognition software on a server or in the cloud. For some individuals the result of the face recognition may be presented as written text in the display of the glasses, but for a blind or low-vision user the result may instead be processed into audio. In other examples, object recognition may determine that the user is approaching a curb, a doorway, or another object, and the glasses or controller will alert the user audibly or haptically. For a low-vision user, the text on the display may be magnified or its contrast may be enhanced.
In embodiments, a GPS sensor may be used to determine the position of the user wearing the adaptive display. The GPS sensor may be accessed by a navigation application to audibly inform the user when the user is near or at various points of interest. In embodiments, the user is audibly guided to a destination by the navigation application.
The eyepiece may be useful for various applications and markets. It should be understood that the control mechanisms described herein may be used to control the functions of the applications described herein. The eyepiece may run a single application at a time, or multiple applications may run at once. Switching between applications may be done with the control mechanisms described herein. The eyepiece may be used in military applications, gaming, image recognition applications, viewing/ordering e-books, GPS navigation (position, direction, speed, and ETA (estimated time of arrival)), mobile TV, athletics (viewing pacing, rank, and competition times; receiving coaching), telemedicine, industrial inspection, aviation, shopping, inventory management and tracking, firefighting (enabled by VIS/NIR/SWIR sensors that see through smoke, haze, and darkness), outdoor/adventure, custom advertising, and the like. In an embodiment, the eyepiece may be used with e-mail (such as GMAIL in Figure 7), the Internet, web browsing, viewing sports scores, video chat, and the like. In an embodiment, the eyepiece may be used for education/training purposes, such as by displaying step-by-step guides (e.g. hands-free, wireless maintenance and repair instructions). For example, a video manual and/or instructions may be displayed in the field of view. In an embodiment, the eyepiece may be used in fashion, health, and beauty. For example, potential outfits, hairstyles, or cosmetics may be projected onto a mirror image of the user. In an embodiment, the eyepiece may be used in business intelligence, meetings, and conferences. For example, a user's name tag can be scanned, their face run through a facial recognition system, or their spoken name searched in a database to quickly obtain biographical information. Scanned name tags, faces, and conversations can be recorded for subsequent viewing or filtering.
In an embodiment, a "mode" may be entered via the eyepiece. In a given mode, particular applications may be available. For example, a consumer version of the eyepiece may have a tourist mode, an educational mode, an internet mode, a TV mode, a gaming mode, a sports mode, a designer mode, a personal assistant mode, and the like.
A user of the augmented reality glasses may wish to participate in a video call or video conference while wearing the glasses. Many computers, both desktop and laptop, have integrated cameras to facilitate video calls and conferencing. Typically, software applications are used to integrate use of the camera with the calling or conferencing features. With the augmented reality glasses providing much of the functionality of laptops and other computing devices, many users may wish to utilize video calls and video conferencing on the move while wearing the augmented reality glasses.
In an embodiment, a video call or video conferencing application may work with a WiFi connection, or may be part of a 3G or 4G calling network associated with the user's cell phone. The camera for the video call or conference is placed on a device controller, such as a watch or other personal electronic computing device. Placing the video call or conference camera on the augmented reality glasses is not practical, since such placement would provide the user with a view only of themselves, and would not display the other participants in the conference or call. However, the user may choose to use a forward-facing camera to display their surroundings or another individual in the video call.
Fig. 32 depicts a typical camera 3200 for use in a video call or conference. Such a camera is typically small and could be mounted on a watch 3202, as shown in Fig. 32, on a cell phone, or on another portable computing device, including a laptop computer. Video calling works by connecting the device controller with the cell phone or other communication device. The devices utilize software compatible with the operating system of the glasses and the communication device or computing hardware. In an embodiment, the screen of the augmented reality glasses may display a list of options for making the call, and the user may gesture using a pointing control device, or use any other control technique described herein, to select the video calling option on the screen of the augmented reality glasses.
Fig. 33 shows an embodiment 3300 of a block diagram of a video calling camera. The camera incorporates a lens 3302, a CCD/CMOS sensor 3304, an analog-to-digital converter 3306 for the video signal, and an analog-to-digital converter 3314 for the audio signal. A microphone 3312 collects the audio input. Both analog-to-digital converters 3306 and 3314 send their output signals to a signal enhancement module 3308. The signal enhancement module 3308 forwards the enhanced signal, which is a composite of both the video and audio signals, to an interface 3310. The interface 3310 is connected to an IEEE 1394 standard bus interface, along with a control module 3316.
In operation, the video calling camera depends on signal capture, which transforms incident light, as well as incident sound, into electrons. For light, this process is performed by the CCD or CMOS chip 3304. The microphone transforms sound into electrical impulses.

The first step in the process of generating an image for a video call is to digitize the image. The CCD or CMOS chip 3304 dissects the image and converts it into pixels. If a pixel has collected many photons, the voltage will be high. If the pixel has collected few photons, the voltage will be low. This voltage is an analog value. During the second step of digitization, the voltage is transformed into a digital value by the analog-to-digital converter 3306, which handles image processing. At this point, a raw digital image is available.
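The digitization step above can be sketched in a few lines. This is a minimal illustration, not part of the patent: the full-scale voltage and bit depth are assumed values chosen only to show how an analog-to-digital converter such as element 3306 would map per-pixel voltages to digital codes.

```python
def quantize_pixel(voltage, v_full_scale=3.3, bits=8):
    """Map an analog pixel voltage to an integer code.

    A pixel that collected many photons yields a high voltage and a high
    code; one that collected few photons yields a low voltage and a low
    code. Full-scale voltage and bit depth are illustrative assumptions.
    """
    voltage = max(0.0, min(voltage, v_full_scale))  # clamp to ADC input range
    levels = (1 << bits) - 1                        # 255 codes for 8 bits
    return round(voltage / v_full_scale * levels)

def digitize_frame(voltages):
    """Digitize a row-major frame of analog pixel voltages into a raw digital image."""
    return [[quantize_pixel(v) for v in row] for row in voltages]
```

For example, a saturated pixel at 3.3 V maps to code 255, while a dark pixel at 0 V maps to code 0, giving the raw digital image described above.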
The audio captured by the microphone 3312 is also transformed into a voltage. This voltage is sent to the analog-to-digital converter 3314, where the analog values are transformed into digital values.
The next step is to enhance the signal so that it may be sent to viewers of the video call or conference. Signal enhancement includes creating color in the image using a color filter located in front of the CCD or CMOS chip 3304. This filter is red, green, or blue and changes its color from pixel to pixel; in an embodiment it may be a color filter array, or Bayer filter. These raw digital images are then enhanced by the filter to meet aesthetic requirements. The audio data may also be enhanced for a better calling experience.
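The pixel-by-pixel color filter idea can be illustrated with a short sketch. The RGGB layout below is an assumption for illustration only (real sensors vary); it shows how each photosite records a single channel of the scene, leaving the two missing channels at every site to be interpolated later.

```python
# Hypothetical 2x2 repeating Bayer (RGGB) pattern; the filter color
# alternates pixel by pixel as described above.
BAYER_RGGB = [["R", "G"],
              ["G", "B"]]

def filter_color(row, col, pattern=BAYER_RGGB):
    """Which color filter sits over the photosite at (row, col)."""
    return pattern[row % 2][col % 2]

def mosaic(rgb_image):
    """Reduce a full-color image (tuples of R, G, B) to the raw
    one-channel-per-pixel mosaic the sensor actually captures."""
    chan = {"R": 0, "G": 1, "B": 2}
    return [[px[chan[filter_color(r, c)]] for c, px in enumerate(row)]
            for r, row in enumerate(rgb_image)]
```

Demosaicing, the enhancement step the patent alludes to, would then reconstruct the two missing channels at each site by interpolating from neighboring photosites.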
In the final step before transmission, the image and audio data are compressed and output as a digital video stream, in an embodiment using a digital video camera. If a photo camera is used, single images may be output, and in a further embodiment, voice comments may be appended to the files. The enhancement of the raw digital data takes place away from the camera, and in an embodiment may occur in the device controller or computing device with which the augmented reality glasses communicate during a video call or conference.
Further embodiments may provide portable cameras for use in industry, medicine, astronomy, microscopy, and other fields requiring specialized camera use. These cameras often forgo signal enhancement and output the raw digital image. These cameras may be mounted on other electronic devices or on the user's hand for ease of use.
The camera interfaces to the augmented reality glasses and the device controller or computing device using an IEEE 1394 interface bus. This interface bus transmits time-critical data, such as video, as well as data whose integrity is critically important, including parameters or files to manipulate data or transfer images.
In addition to the interface bus, protocols define the behavior of the devices associated with the video call or conference. In embodiments, the camera for use with the augmented reality glasses may employ one of the following protocols: AV/C, DCAM, or SBP-2.
AV/C is a protocol for Audio Video Control and defines the behavior of digital video devices, including video cameras and video recorders.
DCAM refers to the 1394-based Digital Camera Specification and defines the behavior of cameras that output uncompressed image data without audio.
SBP-2 refers to Serial Bus Protocol and defines the behavior of mass storage devices, such as hard drives or disks. Devices using the same protocol are able to communicate with each other. Thus, for a video call using the augmented reality glasses, the same protocol may be used by the video camera on the device controller and by the augmented reality glasses. Because the augmented reality glasses, the device controller, and the camera use the same protocol, data may be exchanged among these devices. Files that may be transferred among the devices include: image and audio files, image and audio data streams, parameters to control the camera, and the like.
In an embodiment, a user desiring to initiate a video call may select a video call option from a screen presented when the call is initiated. The user selects by making a gesture using a pointing device, or by gesturing to signal selection of the video call option. The user then positions the camera located on the device controller, watch, or other separable electronic device so that the user's image is captured by the camera. The image is processed through the process described above and is then streamed to the augmented reality glasses and to the other participants for display to the users.
In embodiments, the camera may be mounted on a cell phone, personal digital assistant, watch, pendant, or other small portable device capable of being carried, worn, or mounted. The image or video captured by the camera may be streamed to the eyepiece. For example, when a camera is mounted on a rifle, a wearer may be able to image targets not in the line of sight and wirelessly receive a stream of the images as displayed content on the eyepiece.
In embodiments, the present disclosure may provide the wearer with GPS-based content reception, as shown in Fig. 6. As noted, the augmented reality glasses of the present disclosure may include memory, a global positioning system, a compass or other orienting device, and a camera. GPS-based computer programs available to the wearer may include a number of applications typically available from the Apple Inc. App Store for iPhone use. Comparable versions of these programs are available for other brands of smartphones and may be applied to embodiments of the present disclosure. These programs include, for example, SREngine (scene recognition engine), NearestTube, TAT Augmented ID, Yelp, Layar, and TwittARound, as well as other more specialized applications, such as RealSki.
SREngine is a scene recognition engine that is able to identify objects viewed by the user's camera. It is a software engine able to recognize static scenes, such as scenes of architecture, structures, pictures, objects, rooms, and the like. It is then able to automatically apply a virtual "label" to the structures or objects as recognized. For example, the program may be called up by a user of the present disclosure when viewing a street scene, such as in Fig. 6. Using the camera of the augmented reality glasses, the engine will recognize the Fontaines de la Concorde in Paris. The program will then call up a virtual label, shown in Fig. 6 as part of the virtual image 618 projected onto the lens 602. The label may be text only, as seen at the bottom of the image 618. Other labels applicable to this scene may include "fountain," "museum," "hotel," or the name of the columned building in the rear. Other programs of this type may include the Wikitude AR Travel Guide, Yelp, and many others.
NearestTube, for example, uses the same technology to direct a user to the closest subway station in London, and other programs may perform the same or similar functions in other cities. Layar is another application that uses the camera, a compass or direction, and GPS data to identify the user's location and field of view. With this information, an overlay or label may appear virtually to help orient and guide the user. Yelp and Monocle perform similar functions, but their databases are somewhat more detailed, helping to direct users to restaurants or other service providers in a similar manner.
The user may control the glasses, and call up these functions, using any of the controls described in this patent. For example, the glasses may be equipped with a microphone to pick up voice commands from the user, which are processed using software contained within the memory of the glasses. The user may then respond to prompts from small speakers or earbuds, also contained within the glasses frame. The glasses may also be equipped with a tiny trackpad, similar to those found on smartphones. The trackpad allows the user to move a pointer or indicator on the virtual screen within the AR glasses, similar to a touch screen. When the user reaches a desired point on the screen, the user depresses the trackpad to indicate his or her selection. Thus, a user might call up a program, e.g., a travel guide, and then find his or her way through several menus, perhaps selecting a country, a city, and then a category. The category selections may include, for example, hotels, shopping, museums, restaurants, and so forth. The user makes his or her selections and is then guided by the AR program. In one embodiment, the glasses also include a GPS locator, and the present country and city provide default locations that may be overridden.
In an embodiment, the eyepiece's object recognition software may process the images being received by the eyepiece's forward-facing camera in order to determine what is in the field of view. In other embodiments, the GPS coordinates of the location, as determined by the eyepiece's GPS, may be enough to determine what is in the field of view. In other embodiments, an RFID or other beacon in the environment may be broadcasting a location. Any one or combination of the above may be used by the eyepiece to identify the location and identity of what is in the field of view.
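The GPS-lookup alternative described above can be sketched as a coordinate match against a database of known points of interest. The database entries and the matching tolerance below are hypothetical values for illustration, not data from the patent.

```python
# Hypothetical point-of-interest database: name -> (latitude, longitude).
POI_DB = {
    "Eiffel Tower":         (48.8584, 2.2945),
    "Place de la Concorde": (48.8656, 2.3212),
}

def identify_poi(lat, lon, db=POI_DB, tol_deg=0.005):
    """Return the name of a known point of interest whose stored
    coordinates match the given position within a small tolerance
    (roughly a few hundred meters), or None if nothing matches."""
    for name, (plat, plon) in db.items():
        if abs(lat - plat) <= tol_deg and abs(lon - plon) <= tol_deg:
            return name
    return None
```

In practice such a lookup would be combined with the camera-based object recognition and any RFID/beacon broadcasts, as the paragraph above describes, to disambiguate what is actually in view.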
When an object is recognized, the resolution for imaging that object may be increased, or the image or video may be captured with low compression. Additionally, the resolution for other objects in the user's view may be decreased, or captured at a higher compression rate, in order to decrease the needed bandwidth.
Once determined, content related to points of interest in the field of view may be overlaid on the real-world image, such as social networking content, interactive tours, local information, and the like. Information and content related to movies, local information, weather, restaurants, restaurant availability, local events, local taxis, music, and the like may be accessed by the eyepiece and projected onto the lens of the eyepiece for the user to view and interact with. For example, as the user looks at the Eiffel Tower, the forward-facing camera may take an image and send it for processing to the eyepiece's associated processor. Object recognition software may determine that the structure in the wearer's field of view is the Eiffel Tower. Alternatively, the GPS coordinates determined by the eyepiece's GPS may be searched in a database to determine that the coordinates match those of the Eiffel Tower. In any event, content may then be searched relating to Eiffel Tower visitor information, restaurants in the vicinity and in the Tower itself, local weather, local Metro information, local hotel information, other nearby tourist spots, and the like. Interaction with the content may be enabled by the control mechanisms described herein. In an embodiment, GPS-based content reception may be enabled when a tourist mode of the eyepiece is entered.
In an embodiment, the eyepiece may be used for viewing streaming video. For example, videos may be identified via a search by GPS location, a search by object recognition of an object in the field of view, a voice search, a holographic keyboard search, and the like. Continuing with the example of the Eiffel Tower, a video database may be searched via the GPS coordinates of the Tower or by the term "Eiffel Tower" once it has been determined that that is the structure in the field of view. Search results may include geo-tagged videos or videos associated with the Eiffel Tower. The videos may be scrolled or flipped through using the control techniques described herein. Videos of interest may be played using the control techniques described herein. The video may be laid over the real-world scene, or may be displayed on the lens out of the field of view. In an embodiment, the eyepiece may be darkened via the mechanisms described herein to enable viewing at a higher contrast. In another example, the eyepiece may be able to utilize a camera and network connectivity, such as described herein, to provide the wearer with streaming video conferencing capabilities. The streaming video may be a video of at least one other video conference participant, a visual presentation, and the like. The streaming video may be automatically uploaded to a video storage location as it is captured, without interaction by the eyepiece user. The streaming video may be uploaded to a physical or virtual storage location. The virtual storage location may be located at a single physical location or may be a cloud storage location. The video of the streamed video conference may also be modified by the eyepiece, where the modification may be based on sensor input. The sensor input may be a visual sensor input or an audio sensor input. The visual sensor input may be an image of another participant of the video conference, a visual presentation, and the like. The audio sensor input may be the voice of a participant of the video conference.
In embodiments, the eyepiece may provide an interface for receiving wirelessly streamed media (e.g., video, audio, text messaging, phone calls, and schedule alerts) from external devices, such as a smartphone, tablet, personal computer, entertainment device, portable audio-video device, home theater system, home entertainment system, another eyepiece, and the like. The wireless streaming of media may be through any wireless communications system and protocol known in the art, such as Bluetooth, WiFi, wireless home networking, wireless local area network (WLAN), wireless home digital interface (WHDI), cellular mobile telecommunications, and the like. The eyepiece may also utilize multiple wireless communication systems, such as one for streaming high data rate media (e.g., video), one for low data rate media (e.g., text messaging), one for command data between the external device and the eyepiece, and the like. For example, high data rate video may be streamed through a WiFi DLNA (Digital Living Network Alliance) interface, while Bluetooth is used for low data rate applications such as text messaging.
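The use of multiple wireless links by media type, as described above, amounts to a simple routing decision. The routing table below is an illustrative assumption following the example in the text (WiFi/DLNA for high data rate media, Bluetooth for low data rate traffic), not a specification from the patent.

```python
# Hypothetical media-type -> wireless-link routing table, following the
# example above: high data rate media over WiFi DLNA, low data rate
# traffic and command data over Bluetooth.
ROUTES = {
    "video":        "wifi_dlna",
    "audio":        "wifi_dlna",
    "text_message": "bluetooth",
    "command":      "bluetooth",
}

def pick_link(media_type):
    """Select the wireless link for a given media type, defaulting to
    the low data rate link for anything unrecognized."""
    return ROUTES.get(media_type, "bluetooth")
```

Keeping command data on its own low-rate link, as the paragraph suggests, leaves the high-rate link free to carry uninterrupted video.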
In embodiments, an external device may be provided with an application supporting the interface with the eyepiece. For example, a mobile application may be made available for a user to interface their smartphone with the eyepiece. In embodiments, an external device may be provided with a transmission facility for interfacing with the eyepiece. For example, a transmitter adapter (dongle) may be provided for interfacing the user's smartphone with the eyepiece. Since streaming media from the external device may place much of the processing requirement on the external device, the eyepiece may require less on-board processing capability to accommodate the streaming media. For instance, an embodiment of the eyepiece for accommodating streaming media may include an interface for receiving the streaming media, buffering the data, providing the streaming media to the user through the optical assembly through which the user views the surrounding environment and displayed content, and the like. That is, an embodiment of the eyepiece for receiving streaming media may be a simplified version of the other embodiments of the eyepiece described herein, used as a display for the external device. In an example, a user may be able to stream video from their smartphone to such a "simplified version" of the eyepiece. However, one skilled in the art will appreciate that any of the other functions described herein may also be included to create versions of the eyepiece ranging from a most simplified version functioning only as a display interface for an external device, up to versions including the full range of capabilities described herein, where wireless streaming is one of a plurality of functions and capabilities the eyepiece provides. For example, even in a more simplified version of the eyepiece, the control techniques described herein, power-saving techniques, applications, driving one or both displays, displaying in 3D mode, and the like may still be useful, such as to provide assistance in command modes for the streaming media, in battery management, in optional media viewing modes for increased battery life, and so forth. Alternately, a super-simplified version of the eyepiece may provide an embodiment with minimum cost and complexity, such as where the interface between the external device and the eyepiece is a wired interface. For example, an embodiment of the eyepiece may provide a wired interface between the user's smartphone or tablet and the eyepiece, where the processing capabilities of the eyepiece may be restricted to only the processing required to present the streamed media to the optical assembly for viewing of the content on the lens of the eyepiece.
In other embodiments, an application running on a smartphone may act as a remote input device for the glasses. For example, a user interface such as a keyboard may allow the user to enter characters via the smartphone. The application would make the phone appear to be a Bluetooth keyboard. The application may simply be a full-screen blank application that transmits touches through to a pseudo touch screen driver running on the glasses, enabling the user to pinch-zoom and drag using the smartphone just as those motions would be done in an actual physical place, while getting the tactile feedback of the hand on the touch screen and the visual feedback in the glasses. Thus, more general applications running on the glasses that make use of these types of input gestures may work well where the user is using a smartphone touch screen. The command information may be accompanied by a visual indicator. For example, in order to know where your finger is when you are controlling the glasses or a glasses application using the external device, a visual indication of the command information may be displayed in the glasses, such as a highlighted trace of the finger's motion.
The present disclosure may include an interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content; an integrated image source adapted to introduce the content to the optical assembly; an integrated processor; and an external device having a physical user interface and an application that turns the external device into a user interface for the eyepiece operable through the integrated processor, wherein physical interactions with the external device are indicated in the displayed content. In embodiments, the external device may be a smartphone, a tablet, a mobile navigation device, and the like. The physical user interface may be a keypad, a touchpad, a control interface, and the like. For instance, the physical interface may be an iPhone, and the displayed content a virtual keypad that shows the user's actions on the iPhone keypad as actions on the virtual keypad displayed on the eyepiece, such as a key highlighted on the virtual keypad as the user's finger actually interacts with the physical keypad of the iPhone, an indication of a key press, and the like. The finger motion may be one of a selection of content and a manipulation of displayed content. The manipulation may be a multiple finger motion on the touchpad, such as a two-finger pinch manipulation for resizing the displayed content on the eyepiece.
As noted above, users of augmented reality may receive content from an abundance of sources. A visitor or tourist may desire to limit the choices to local businesses or institutions; on the other hand, businesses seeking out visitors or tourists may wish to limit their offers or solicitations to persons who are in their area or location but who are visiting rather than local. Thus, in one embodiment, a visitor or tourist may limit his or her searches to local businesses only, i.e., those within certain geographic limits. These limits may be set via GPS criteria or by manually indicating a geographic restriction. For example, a person may require that sources of streaming content or ads be limited to those within a certain radius (a set number of kilometers or miles) of the person. Alternatively, the criteria may require that the sources be limited to those within a certain town or province. These limits may be set by the augmented reality user just as a user of a computer at home or at an office would limit his or her searches using a keyboard and a mouse; the inputs for the augmented reality user are simply made by voice, by hand motion, or by the other modes described in the portions of this disclosure discussing controls.
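The radius restriction described above can be sketched as a great-circle distance filter over candidate content sources. The coordinates and source names below are hypothetical examples, not data from the patent.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def local_sources(user_pos, sources, radius_km):
    """Keep only content sources within radius_km of the user, per the
    geographic restriction described above."""
    lat, lon = user_pos
    return [name for name, (slat, slon) in sources
            if haversine_km(lat, lon, slat, slon) <= radius_km]
```

For a user in central Paris with a 10 km radius, a nearby cafe would pass the filter while a merchant in Lyon would be excluded; widening the radius (or switching to a city/province criterion, as the text allows) would admit more sources.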
In addition, the available content chosen by a user may be restricted or limited by the type of provider. For example, a user may restrict choices to those with a web site operated by a government institution (.gov) or those operated by a non-profit institution or organization (.org). In this manner, a tourist or visitor who may be more interested in visiting government offices, museums, historical sites, and the like may find his or her choices less cluttered. The person may be more easily able to make decisions once the available choices have been pared down to a more reasonable number. The ability to quickly cut down the available choices is desirable in more urban areas with an abundance of choices, such as Paris or Washington, D.C.
The user controls the glasses in any of the manners or modes described elsewhere in this patent. For example, the user may call up a desired program or application by voice or by indicating a choice on the virtual screen of the augmented reality glasses. The augmented glasses may respond to a trackpad mounted on the frame of the glasses, as described above. Alternatively, the glasses may be responsive to one or more motion or position sensors mounted on the frame. The signals from the sensors are then sent to the microprocessor or microcontroller within the glasses, the glasses also providing any needed signal transducing or processing. Once the program of choice has begun, the user makes selections and enters a response by any of the methods discussed herein, such as signaling "yes" or "no" with a head movement, a hand gesture, a trackpad depression, or a voice command.
Meanwhile content provider (i.e. advertiser) may also wish their supply being limited to specific geographical area (such as Their city scope) in people.Meanwhile advertiser's (may is that museum) may not want that and provide content to local People, but may want to touch visitor or stranger.In another example, advertisement can not be presented when the user is at home, but worked as and used Advertisement is presented when family travels or is away from home.Augmented reality equipment discussed in this article is preferably provided with GPS ability and telecommunications energy Power, and for realizing the integrated processor based on geographical rule presented for advertisement.Pass through limit for museum Its broadcasting power is made to provide streamed content in finite region will be a simple thing.However, museum may lead to It crosses internet and content is provided, and its content may can get in the world.In this example, user can be set by augmented reality Standby reception content is opened the door and to be apprised of for visit museum's today.
The user may respond to the content by the augmented reality equivalent of clicking on a link for the museum. The augmented reality equivalent may be a voice indication, a hand or eye movement, or another sensory indication of the user's selection, or use of an associated body-mounted controller. The museum then receives a cookie indicating the identity of the user, or at least the user's internet service provider (ISP). If the cookie indicates or suggests an internet service provider other than local providers, the museum server may then respond with advertisements or offers tailored to visitors. The cookie may also include an indication of a telecommunications link, such as a telephone number. If the telephone number is not a local number, this is an additional clue that the person responding is a visitor. The museum or other institution may then follow up with the content desired or suggested by its marketing department.
Another application of the augmented reality eyepiece takes advantage of the user's ability to control the eyepiece and its tools with minimal use of the user's hands, using instead voice commands, gestures, or motions. As noted above, a user may call upon the augmented reality eyepiece to retrieve information. This information may already be stored in the memory of the eyepiece, but may instead be located remotely, such as in a database accessible over the Internet, or perhaps via an intranet accessible only to employees of a particular company or organization. The eyepiece may thus be compared to a computer, or to a display screen, that can be viewed and heard at an extremely close range and generally controlled with minimal use of one's hands.
Applications may thus include providing information on the spot to a mechanic or electronics technician. The technician can don the glasses when seeking information about a particular structure or problem encountered, such as when repairing an engine or a power supply. Using voice commands, he or she may then access the database and search within it for particular information, such as manuals or other repair and maintenance documents. The desired information may thus be accessed immediately and applied with minimal effort, allowing the technician to more quickly perform the needed repair or maintenance and to return the equipment to service. For mission-critical equipment, such time savings may also save lives, in addition to saving repair or maintenance costs.
The information imparted may include repair manuals and the like, but may also include a full range of audio-visual information; i.e., the eyepiece screen may display a video demonstrating how to perform a particular task to the technician or mechanic at the same time the person is attempting to perform the task. The augmented reality device also includes telecommunications capabilities, so the technician also has the ability to call on others for assistance if the task involves some complication or unexpected difficulty. This educational aspect of the present disclosure is not limited to maintenance and repair, but may be applied to any educational endeavor, such as intermediate classes, advanced classes, continuing education courses or topics, seminars, and the like.
In an embodiment, a WiFi-enabled eyepiece may run a location-based application for geo-location of opted-in users. Users may opt in by logging into the application on their phone and enabling broadcast of their location, or by enabling geo-location on their own eyepiece. As the wearer of the eyepiece scans people, and thus their opted-in devices, the application may identify opted-in users and send an instruction to the projector to project an augmented reality indicator onto an opted-in user in the user's field of view. For example, green rings may be placed around people who have opted in to have their location seen. In another example, yellow rings may indicate people who have opted in but who do not meet certain criteria, such as not having a FACEBOOK account, or having no mutual friends if they do have a FACEBOOK account.
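The opt-in indicator logic above can be sketched as a small decision function. The particular criteria checked (account presence, mutual friends) follow the FACEBOOK example in the text, but the field names are hypothetical.

```python
def ring_color(person):
    """Choose the augmented reality ring overlay for a scanned person.

    `person` is a dict with 'opted_in' (bool), 'has_account' (bool),
    and 'mutual_friends' (int). Opted-in users who meet the criteria
    get a green ring; opted-in users who do not get a yellow ring;
    non-participants get no overlay at all.
    """
    if not person.get("opted_in"):
        return None  # no overlay for users who have not opted in
    meets_criteria = bool(person.get("has_account")) and person.get("mutual_friends", 0) > 0
    return "green" if meets_criteria else "yellow"
```

The projector instruction described above would then draw the returned ring color around the person's position in the field of view.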
Certain social networking, career networking, and dating applications may cooperate with the location-based application. Software resident on the eyepiece may coordinate data from the networking and dating sites with the location-based application. For example, TwittARound is one such program, which makes use of a mounted camera to detect and label location-stamped tweets from other tweeters nearby. This enables a person using the present disclosure to locate other nearby Twitter users. Alternatively, users may have to set their devices to coordinate information from various networking and dating sites. For example, the wearer of the eyepiece may want to see all E-HARMONY users who are broadcasting their location. If an opted-in user is identified by the eyepiece, an augmented reality indicator may be laid over the opted-in user. The indicator may take on a different appearance if the user has something in common with the wearer, many things in common with the wearer, and the like. For example, and referring to Fig. 16, two people are being viewed by the wearer. Both of them are identified as E-HARMONY users by the rings placed around them. However, the woman shown with solid rings has at least one item in common with the wearer, while the woman shown with dotted rings has nothing in common with the wearer. Any available profile information may be accessed and displayed to the user.
In one embodiment, when the wearer points the eyepiece in the direction of a user who has a networking account, such as FACEBOOK, TWITTER, BLIPPY, LINKEDIN, GOOGLE, WIKIPEDIA, and the like, the user's latest posts or profile information may be displayed to the wearer. For example, recent status updates, "tweets" (as described above for TwittARound), "blips", and the like may be displayed. In one embodiment, when the wearer points the eyepiece in a target user's direction, the wearer may indicate interest in the user if the eyepiece is pointed for a duration of time and/or a gesture, head, eye, or audio control is activated. The target user may receive an indication of interest on their phone or in their glasses. If the target user had marked the wearer as interesting but was waiting for the wearer to show interest first, an indication of the target user's interest may pop up immediately in the eyepiece. A control mechanism may be used to capture an image and store the target user's information on associated nonvolatile memory or in an online account.
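The dwell-time interest trigger and the mutual-interest handshake above can be sketched in a few lines. The two-second threshold and the bookkeeping structure are illustrative assumptions; the disclosure does not specify them.

```python
# Sketch of the interest trigger, under assumed parameters. The threshold
# value and the interest-log data structure are not from the disclosure.

DWELL_THRESHOLD_S = 2.0  # assumed dwell time before interest is signaled

def check_interest(pointed_duration_s, control_activated=False):
    """True when the eyepiece has been pointed at the target long enough,
    or a gesture/head/eye/audio control was activated."""
    return pointed_duration_s >= DWELL_THRESHOLD_S or control_activated

def register_interest(interest_log, wearer, target):
    """Record the wearer's interest in the target; return True when the
    target had already registered interest in the wearer (a match that
    would pop up immediately in the eyepiece)."""
    interest_log.setdefault(wearer, set()).add(target)
    return wearer in interest_log.get(target, set())
```

A delivery layer would then route the indication to the target's phone or glasses.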
In another application for social networking, a facial recognition program may be used, such as TAT Augmented ID from TAT-The Astonishing Tribe of Malmö, Sweden. Such a program may be used to identify a person by their facial characteristics; the software uses facial recognition to identify the individual. Using other applications, such as photo identification software from Flickr, one can then identify the particular nearby person, and one can then download information about that person from social networking sites that hold information about the individual. This information may include the person's name and the profile the person has made available on sites such as Facebook, Twitter, and the like. The application may be used to refresh the user's memory of a person, to identify a nearby person, and to gather information about that person.
In another application for social networking, the wearer may be able to use the location-based facilities of the eyepiece to leave notes, comments, reviews, and the like, in association with people, at various places, about various products, and so on. For example, a person may post a comment on a place they have visited, where the posting may then be made available to others through the social network. In another example, a person may post the comment at the location of the place, so that the comment is available to another person who comes to that location. In this way, a wearer may access comments left by others when they come to the place. For instance, a wearer may come to the entrance of a restaurant and be able to access reviews of the restaurant, such as reviews sorted by some criterion (e.g., most recent review, age of the reviewer, and the like).
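The place-bound commenting flow above might be sketched as a simple store keyed by place. The record fields and the most-recent-first default are illustrative assumptions.

```python
# Sketch of place-bound comments retrievable on arrival. The record
# shape (author/text/timestamp) is an assumption for illustration.

def post_comment(store, place_id, author, text, timestamp):
    """Leave a comment associated with a place."""
    store.setdefault(place_id, []).append(
        {"author": author, "text": text, "timestamp": timestamp})

def comments_at(store, place_id, sort_key="timestamp"):
    """Comments left at a place, sorted by a criterion
    (most recent first by default)."""
    return sorted(store.get(place_id, []),
                  key=lambda c: c[sort_key], reverse=True)
```

On arrival at the restaurant entrance, the eyepiece would call `comments_at` for the identified place and project the results.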
The user may initiate the desired program by speech, by making a selection from a virtual touch screen as described above, by using a trackpad to select and choose the desired program, or by any of the control techniques described herein. Menu selections may then be made in a similar or complementary fashion. Sensors or input devices mounted at convenient locations on the user's body may also be used, such as sensors and a trackpad mounted on a wrist pad or on a glove, or even a separate device, perhaps of the size of a smart phone or personal digital assistant.
Applications of the present disclosure may provide the wearer with Internet access, such as for browsing, searching, shopping, and entertainment, for example through a wireless communications interface to the eyepiece. For instance, the wearer may initiate a web search with a control gesture, such as through a control device worn on some part of the wearer's body (e.g., on the hand, the head, the foot), on some component the wearer is currently using (e.g., a personal computer, a smart phone, a music player), on a piece of furniture near the wearer (e.g., a chair, a desk, a table, a lamp), and the like, where the image of the web search is projected through the eyepiece for viewing by the wearer. The wearer may then view the search through the eyepiece and control web interaction through the control device.
In one example, the user may be wearing an embodiment configured as a pair of glasses, where a projected image of an Internet web browser is provided through the eyepiece while retaining the ability to simultaneously view at least portions of the surrounding real environment. In this case, the user may be wearing a motion-sensitive control device on their hand, where the control device may transmit the relative motion of the user's hand to the eyepiece as control motions for web control, such as similar to a mouse in a conventional personal-computer configuration. It will be appreciated that the user would be enabled to perform web actions in a manner similar to a conventional personal-computer configuration. In this instance, the image of the web search is provided through the eyepiece, while control over the selection of actions to execute the search is provided through motions of the hand. For example, the overall motion of the hand may move a cursor within the projected image of the web search, a flick of a finger may provide a selection action, and so forth. In this way, through an embodiment connected to the Internet, the wearer may be enabled to perform the desired web search, or any other Internet-browser-enabled function. In one example, the user may have downloaded computer programs such as Yelp or Monocle from the App Store, or similar products, such as the NRU ("near you") application from Zagat to locate nearby restaurants or other stores, Google Earth, Wikipedia, and the like. The person may initiate a search, for example for restaurants, or for other goods or service providers, such as hotels, repairmen, and the like, or for information. When the desired information is found, locations are displayed, or a distance and direction to a desired location is displayed. The display may take the form of a virtual label co-located with the real-world object in the user's view.
Other applications from Layar (Amsterdam, the Netherlands) include a variety of "layers" tailored to specific information desired by a user. A layer may include restaurant information, information about a specific company, real estate listings, gas stations, and so on. Using the information provided in a software application, such as a mobile application, and the user's global positioning system (GPS), information may be presented on the screen of the glasses with tags bearing the desired information. Using the haptic controls or other controls discussed elsewhere in this disclosure, the user may pivot or otherwise rotate his or her body and view buildings tagged with virtual tags containing information. If the user is searching for restaurants, the screen will display restaurant information, such as name and location. If the user is searching for a particular address, virtual tags will appear on buildings in the user's field of view. The user may then make selections or choices by voice, by trackpad, by virtual touch screen, and so forth.
Applications of the present disclosure may provide a way for advertisements to be delivered to the wearer. For example, advertisements may be displayed to the viewer through the eyepiece as the viewer goes about his or her day, while browsing the Internet, while conducting a web search, while walking through a store, and the like. For instance, the user may be performing a web search, and through the web search the user is targeted with an advertisement. In this example, the advertisement may be projected in the same space as the projected web search, floating off to the side, above, or below the wearer's view angle. In another example, advertisements may be triggered for delivery to the eyepiece when some advertisement providing facility, perhaps one in proximity to the wearer, senses the presence of the eyepiece (such as through a wireless connection, RFID, and the like) and directs the advertisement to the eyepiece. In embodiments, the eyepiece may be used for tracking advertising interactions, such as the user viewing a billboard, a promotion, an advertisement, or the like, or interacting with one. For example, the user's behavior with respect to advertisements may be tracked, such as for providing the user with benefits, rewards, and the like. In one example, the user may be paid $5 in a virtual currency each time the user views a billboard. The eyepiece may provide impression tracking, such as based on the viewing of a brand image (e.g., based on time, geography, and the like). As a result, offerings may be targeted based on location and on events related to the eyepiece (such as what the user sees, what the user hears, and what the user interacts with). In embodiments, advertising targeting may be based on historical behavior, such as based on what the user has interacted with in the past, how they interacted, and the like.
For example, the wearer may be window-shopping in Manhattan, where stores are equipped with such advertisement providing facilities. As the wearer walks by a store, the advertisement providing facility may trigger delivery of an advertisement to the wearer based on the known location of the user, as determined by an integrated location sensor of the eyepiece, such as GPS. In one embodiment, the location of the user may be further refined via other integrated sensors, such as a magnetometer, to enable hyper-local augmented reality advertising. For example, a user on the ground floor of a mall may receive a particular advertisement if the magnetometer and GPS readings place the user in front of a particular store. When the user moves up one floor in the mall, the GPS location may remain the same, but the magnetometer reading may indicate a change in the user's elevation and a new placement in front of a different store. In embodiments, personal profile information may be stored such that the advertisement providing facility can better match advertisements to the needs of the wearer, the wearer may provide preferences for advertisements, the wearer may block at least some advertisements, and the like. The wearer may also be able to pass advertisements, and associated discounts, on to friends: the wearer may communicate them directly to nearby friends who are enabled with their own eyepiece, or convey them through a wireless Internet connection, such as to a social network of friends, by email, by SMS, and so forth. The wearer may be connected to facilities and/or infrastructure that enable the communication of advertisements from a sponsor to the wearer; of feedback from the wearer to an advertising facility, an advertisement sponsor, and the like; to other users, such as friends and family members, or someone in proximity to the wearer; to a store, such as locally on the eyepiece or at a remote site, such as on the Internet or on the user's home computer; and so on. These interconnectivity facilities may include integrated facilities of the eyepiece for providing the user's location and gaze direction, such as through the use of GPS, three-axis sensors, a magnetometer, a gyroscope, an accelerometer, and the like, for determining the wearer's direction, speed, and attitude (e.g., gaze direction). Interconnectivity facilities may provide telecommunications facilities, such as a cellular link, a WiFi/MiFi bridge, and the like. For instance, the wearer may be able to communicate through an available WiFi link, through an integrated MiFi (or any other personal or group cellular link) to the cellular system, and so forth. There may be facilities for the wearer to store advertisements for later use. There may be facilities, integrated with the wearer's eyepiece or located in a local computer facility, that enable caching of advertisements, such as within a local area, where the cached advertisements may enable delivery of an advertisement as the wearer nears a location associated with that advertisement. For example, local advertisements may be stored on a server containing geo-located local advertisements and specials, and these advertisements may be delivered to the wearer individually as the wearer approaches a particular location, or a set of advertisements may be delivered to the wearer in bulk when the wearer enters a geographic area associated with the advertisements, so that the advertisements are available when the user nears a particular location. The geographic location may be a city, a part of a city, a number of blocks, a single block, a street, a portion of a street, a sidewalk, and the like, representing regional, local, and hyper-local areas. Note that the discussion above uses the term advertisement, but one skilled in the art will appreciate that this can also mean an announcement, a broadcast, a circular, a commercial, a sponsored communication, an endorsement, a notice, a promotion, a bulletin, a message, and the like.
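The geo-fenced caching described in this passage can be illustrated with a small sketch: a cache of geo-located advertisements queried against the wearer's position. The flat-earth distance approximation and the 100-meter default radius are assumptions for illustration only.

```python
# Sketch of batch delivery from a geo-located advertisement cache.
# The distance approximation (fine for nearby points) and the radius
# default are illustrative assumptions, not from the disclosure.
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate ground distance in meters between nearby points."""
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1) * math.cos(math.radians(lat1))
    return 6371000.0 * math.hypot(dlat, dlon)

def ads_for_position(cache, lat, lon, radius_m=100.0):
    """Return every cached ad whose geo-fence contains the wearer."""
    return [ad for ad in cache
            if distance_m(lat, lon, ad["lat"], ad["lon"]) <= radius_m]
```

A bulk delivery for a whole geographic area would simply use a larger radius (or a polygonal fence) over the same cache.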
Figures 18-20A depict ways of delivering custom messages to people within a short distance of a facility that wishes to send a message, such as a retail store. Referring now to Figure 18, embodiments may provide ways of viewing custom billboards, such as when the wearer of the eyepiece is walking or driving, through applications as described above for searching for providers of goods and services. As depicted in Figure 18, the billboard 1800 shows an exemplary augmented-reality-based advertisement displayed by a seller or a service provider. The exemplary advertisement, as depicted, may relate to an offer on drinks provided by a bar. For example, two drinks may be provided for the cost of just one. With such augmented-reality-based advertisements and offers, the wearer's attention may be easily directed to the billboard. The billboard may also provide details about the location of the bar, such as street address, floor number, phone number, and the like. In accordance with other embodiments, several devices other than the eyepiece may be utilized to view the billboard. These devices may include, without limitation, smart phones, IPHONEs, IPADs, car windshields, user glasses, helmets, wristwatches, headphones, vehicle mounts, and the like. In accordance with one embodiment, a user (the wearer, in the case where the augmented reality technology is embedded in the eyepiece) may automatically receive offers or view the scene of the billboard when passing or driving by the road. In accordance with another embodiment, the user may receive offers or view the scene of the billboard based on his request.
Figure 19 illustrates two exemplary roadside billboards 1900 containing offers and advertisements from sellers or service providers that may be viewed in an augmented reality mode. The augmented advertisements may provide a live and near-to-reality perception to the user or the wearer.
As shown in Figure 20, augmented-reality-enabled devices, such as the camera lens provided in the eyepiece, may be utilized to receive and/or view graffiti 2000, posters, drawings, and the like, displayed at the roadside or on the tops, sides, and fronts of buildings and shops. The roadside billboards and graffiti may have a visual indicator (e.g., a code, a shape) or a wireless indicator that may link the advertisement, or an advertisement database, to the billboard. When the wearer nears and views the billboard, a projection of the billboard advertisement may then be provided to the wearer. In embodiments, personal profile information may also be stored such that advertisements better match the needs of the wearer, the wearer may provide preferences for advertisements, the wearer may block at least some advertisements, and the like. In embodiments, the eyepiece may have brightness and contrast control over the billboard-viewing area of the eyepiece projection, so as to improve the readability of the advertisement, such as in a bright outside environment.
In other embodiments, users may post information or messages at a particular location, based on the location's GPS position or another indicator of location, such as a magnetometer reading. The intended viewer is able to see the message when the viewer is within a certain distance of the location, as explained in connection with Figure 20A. In a first step 2001 of the method of Figure 20A, a user decides the location at which the message is to be received by the persons to whom the message is sent. The message is then posted 2003, to be sent to the appropriate person or persons when a recipient is close to the intended "viewing area." The location of the wearer of the augmented reality eyepiece is continuously updated 2005 by the GPS system that forms a part of the eyepiece. When the GPS system determines that the wearer is within a certain distance of the desired viewing area (e.g., 10 meters), the message is then sent 2007 to the viewer. In one embodiment, the message then appears as an email or a text message to the recipient, or, if the recipient is wearing an eyepiece, the message may appear in the eyepiece.
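The posting and delivery steps 2001-2007 above can be sketched as two functions: one that posts a message bound to a location, and one that, given an updated GPS fix, returns messages whose viewing area now contains the wearer. The message structure is an illustrative assumption; the 10-meter radius comes from the example in the text.

```python
# Sketch of the Figure 20A flow under assumed data structures.
import math

def post_message(board, text, lat, lon, radius_m=10.0):
    """Steps 2001/2003: choose the location and post the message."""
    board.append({"text": text, "lat": lat, "lon": lon,
                  "radius_m": radius_m})

def deliver(board, wearer_lat, wearer_lon):
    """Steps 2005/2007: given an updated GPS fix, return messages whose
    viewing area contains the wearer."""
    due = []
    for msg in board:
        dlat = (msg["lat"] - wearer_lat) * 111320.0  # meters/degree lat
        dlon = ((msg["lon"] - wearer_lon) * 111320.0
                * math.cos(math.radians(wearer_lat)))
        if math.hypot(dlat, dlon) <= msg["radius_m"]:
            due.append(msg["text"])
    return due
```

In a real eyepiece, `deliver` would run on each GPS update and route the result to the display, email, or SMS as described.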
Because the message is sent to a person based on the person's location, in a sense the message may be displayed as "graffiti" on a building or feature at or near the specified location. Specific settings may be used to determine whether all people passing through the "viewing area" can see the message, or only a specific person, a specific group, or devices with particular identifiers. For example, a soldier who has cleared a village may virtually mark a house as cleared by associating a message or identifier with the house, such as a large X marking the position of the house. The soldier may indicate that only other US soldiers may receive the location-based content. When other US soldiers pass the house, they may automatically receive an indication, such as by seeing a virtual "X" on the side of the house (if they have an eyepiece or some other augmented-reality-enabled device), or by receiving a message indicating that the house has been cleared. In another example, content related to security applications, such as alerts, target identification, communications, and the like, may be streamed to the eyepiece.
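The audience restriction described here (the "only other US soldiers" example) reduces to a simple visibility check. The field names (`audience`, `groups`, `device_id`) are assumptions for illustration.

```python
# Sketch of per-message audience filtering. Field names are assumed.

def visible_to(message, viewer):
    """A message with no audience set is public; otherwise the viewer
    must belong to an allowed group or carry an allowed device id."""
    audience = message.get("audience")
    if not audience:
        return True
    return bool(set(viewer.get("groups", [])) & set(audience)
                or viewer.get("device_id") in audience)
```

A delivery layer would call this check before projecting the virtual "X" or forwarding the cleared-house notice.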
Embodiments may provide ways of viewing information associated with products, such as in a store. The information may include nutritional information for food products, care instructions for clothing products, technical specifications for consumer electronics products, e-coupons, promotions, price comparisons with other similar products, price comparisons with other stores, and the like. This information may be projected in relative position to the product, to the periphery of the wearer's sight, in relation to the store layout, and the like. The product may be identified visually, such as through a SKU, a brand tag, and the like; transmitted by the product packaging, such as through an RFID tag on the product; or transmitted by the store, such as based on the wearer's position in the store relative to the products.
For example, a viewer may be walking through a clothing store, and as they walk they may be provided with information on the clothes on the rack, where the information is provided through the products' RFID tags. In embodiments, the information may be delivered as a list of information, as a graphic representation, as an audio and/or video presentation, and the like. In another example, the wearer may be shopping for food, and an advertisement providing facility may provide information to the wearer in association with products in the wearer's proximity; the wearer may be provided with the information when they pick up a product and view the brand, product name, SKU, and the like. In this way, the wearer may be provided with a more informative environment in which to shop effectively.
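The lookup behind this example is essentially a catalog keyed by the scanned identifier (an RFID tag or a SKU). The catalog contents and key format below are purely illustrative assumptions.

```python
# Sketch of product-information lookup by RFID tag or SKU.
# The catalog entries and key scheme are illustrative assumptions.

CATALOG = {
    "rfid:001": {"name": "wool sweater", "care": "hand wash cold"},
    "sku:4711": {"name": "canned soup", "nutrition": "250 kcal/serving"},
}

def product_info(identifier, catalog=CATALOG):
    """Resolve a scanned identifier to displayable product information."""
    return catalog.get(identifier, {"name": "unknown product"})
```

The eyepiece would render the returned fields (care instructions, nutrition, price comparisons) in relative position to the product.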
One embodiment may allow the user to receive or share information about shopping or an urban area through the use of augmented-reality-enabled devices, such as a camera lens fitted in an exemplary sunglass eyepiece. These embodiments would use augmented reality (AR) software applications, such as those mentioned above in conjunction with searching for providers of goods and services. In one scenario, the wearer of the eyepiece may walk down a street or through a market for shopping purposes. Further, the user may activate various modes that assist in defining user preferences for a particular scenario or environment. For example, the user may enter a navigation mode, through which the wearer may be guided across streets and through the market to shop for preferred accessories and products. The mode may be selected, and various directions may be given by the wearer, through various methods, such as text commands, voice commands, and the like. In one embodiment, the wearer may give a voice command to select the navigation mode, which may result in an augmented display in front of the wearer. The augmented information may depict information pertinent to the various shops and vendors in the market, such as the location of the shops and vendors, offers being provided by the various vendors, current happy hours, the current date and time, and the like. Various sorts of options may also be displayed to the wearer. The wearer may scroll through the options and walk down the street guided by the navigation mode. Based on the options provided, the wearer may select a place that best suits him for shopping, based on such things as offers and discounts. In embodiments, the eyepiece may provide the ability to search, browse, select, save, and share articles for purchase (such as those viewed through the eyepiece), to receive advertisements for articles for purchase, and the like. For example, the wearer may search for an article on the Internet and purchase it without making a phone call (such as through an app store, an e-commerce application, and the like).
The wearer may give a voice command to navigate toward the place, and the wearer may then be directed to it. The wearer may also receive advertisements and offers, either automatically or on request, regarding current deals, promotions, and events at the location of interest, such as a nearby shopping store. The advertisements, deals, and offers may appear in proximity to the wearer, and options may be displayed for purchasing desired products based on the advertisements, deals, and offers. The wearer may, for example, select a product and purchase it through a Google Checkout. A message or email, similar to the one depicted in Figure 7, may appear on the eyepiece with information that the transaction for the purchase of the product has been completed. Product delivery status/information may also be displayed. The wearer may further convey or alert friends and relatives about the offers and events through social networking platforms, and may also ask them to join.
In embodiments, the user may wear the head-mounted eyepiece, where the eyepiece includes an optical assembly through which the user may view the surrounding environment and displayed content. The displayed content may comprise one or more local advertisements. The location of the eyepiece may be determined by an integrated location sensor, and the local advertisement may have a relevance to the location of the eyepiece. By way of example, the user's location may be determined via GPS, RFID, manual entry, and the like. Further, the user may be walking by a coffee shop, and based on the user's proximity to the shop, an advertisement similar to the storefront brand 1900 depicted in Figure 19 (such as for a fast-food restaurant, or a brand of coffee) may appear in the user's field of view. As the user moves about the surrounding environment, he or she may experience similar types of local advertisements.
In other embodiments, the eyepiece may contain a capacitive sensor capable of sensing whether the eyepiece is in contact with human skin. Such a sensor may be a capacitive sensor, a resistive sensor, an inductive sensor, an EMF sensor, and the like. Such a sensor, or group of sensors, may be placed on the eyepiece and/or on the eyepiece arm in a manner that allows detection of when the glasses are being worn by the user. In other embodiments, sensors may be used to determine whether the eyepiece is in a position such that it may be worn by the user, for example when the eyepiece is in an unfolded position. Furthermore, local advertisements may be sent only when the eyepiece is in contact with human skin, in a wearable position, a combination of the two, when actually being worn by the user, and the like. In other embodiments, local advertisements may be sent in response to the eyepiece being powered on, or in response to the eyepiece being powered on and worn by the user, and the like. By way of example, an advertiser may choose to send local advertisements only when the user is in proximity to a particular establishment and only when the user is actually wearing the glasses and they are powered on, thereby allowing the advertiser to target the advertisement to the user at an opportune time.
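The delivery conditions in this passage combine into a single gate. Which conditions a given advertiser requires is configurable; the flag names below are illustrative assumptions, not the disclosed interface.

```python
# Sketch of sensor-gated ad delivery. Flag names are assumptions.

def may_send_ad(powered_on, skin_contact, wearable_position,
                near_establishment, require_worn=True):
    """Allow delivery only when the eyepiece is powered on, near the
    advertiser's establishment, and (optionally) actually worn, as
    judged by skin contact plus a wearable (unfolded) position."""
    worn = skin_contact and wearable_position
    return powered_on and near_establishment and (worn or not require_worn)
```

An advertiser targeting only actively worn glasses would leave `require_worn` at its default; one content to reach a powered-on but unworn eyepiece would pass `require_worn=False`.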
In accordance with other embodiments, the local advertisement may be displayed to the user as a banner, a two-dimensional graphic, text, and the like. Further, the local advertisement may be associated with a physical aspect of the user's view of the surrounding environment. The local advertisement may also be displayed as an augmented reality advertisement, where the advertisement is associated with a physical aspect of the surrounding environment. Such an advertisement may be two- or three-dimensional. By way of example, a local advertisement may be associated with a physical billboard, as described further in Figure 18, where the user's attention may be drawn to displayed content showing a beverage being poured from the billboard 1800 onto an actual building in the surrounding environment. The local advertisement may also include sound displayed to the user through earphones, an audio device, or other means. Further, the local advertisement may be animated in embodiments. For example, the user may view the beverage flow from the billboard onto an adjacent building and, optionally, into the surrounding environment. Similarly, an advertisement may display any other type of motion as desired in the advertisement. Additionally, the local advertisement may be displayed as a three-dimensional object that may be associated with or interact with the surrounding environment. In embodiments where the advertisement is associated with an object in the user's view of the surrounding environment, the advertisement may remain associated with, or in proximity to, the object even as the user turns his head. For example, if an advertisement, such as the coffee cup described in Figure 19, is associated with a particular building, the coffee cup advertisement may remain associated with and in an appropriate position on the building even as the user turns his head to look at another object in his environment.
In other embodiments, local advertisements may be displayed to the user based on a web search conducted by the user, where the advertisement is displayed within the content of the web search results. For example, the user may search for "happy hour" as he walks down the street, and within the content of the search results a local advertisement may be displayed advertising a local bar's beer prices.
Furthermore, the content of the local advertisement may be determined based on the user's personal information. The user's information may be made available to a web application, an advertising facility, and the like. Further, the web application, the advertising facility, or the user's eyepiece may filter the advertising based on the user's personal information. Generally, for example, the user may store personal information about his likes and dislikes, and such information may be used to direct advertising to the user's eyepiece. By way of specific example, the user may store data about his affinity for a local sports team, and as advertisements are made available, those related to his favored team may be given preference and pushed to the user. Similarly, the user's dislikes may be used to exclude certain advertisements from view. In various embodiments, advertisements may be cached on a server, where the advertisement may be accessed by at least one of an advertising facility, a web application, and the eyepiece, and displayed to the user.
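Preference-based filtering and ranking as described here might look like the following: dislikes are excluded outright, and liked topics are moved to the front of the queue. The `topic` field and set-based profile are assumptions for illustration.

```python
# Sketch of like/dislike filtering of cached advertisements.
# The ad record shape ("topic") is an illustrative assumption.

def select_ads(ads, likes, dislikes):
    """Drop disliked topics, then rank liked topics first while
    preserving the cached order otherwise (sorted() is stable)."""
    kept = [ad for ad in ads if ad["topic"] not in dislikes]
    return sorted(kept, key=lambda ad: ad["topic"] not in likes)
```

The same function could run on the server-side cache, in the web application, or on the eyepiece itself, per the passage above.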
In embodiments, the user may interact with any type of local advertisement in numerous ways. The user may request additional information related to a local advertisement by making at least one action among an eye movement, a body movement, and other gestures. For example, if an advertisement is displayed to the user, he may wave his hand over the advertisement in his field of view, or move his eyes over the advertisement, in order to select that particular advertisement and receive more information about it. Moreover, the user may choose to ignore the advertisement through any movement or control technique described herein (eye movement, body movement, other gestures, and the like). Further, the user may choose to have the advertisement ignored by default by not selecting it for further interaction within a given period of time. For example, if the user chooses not to gesture for more information within five seconds of the advertisement being displayed, the advertisement may be ignored by default and disappear from the user's view. Furthermore, the user may choose not to allow local advertisements to be displayed, whereby the user selects such an option on a graphical user interface, or turns such a feature off via a control on the eyepiece.
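The default-ignore behavior above can be modeled with explicit timestamps rather than real waiting. The five-second window comes from the example in the text; the state names are assumptions.

```python
# Sketch of the timed default-dismissal of a displayed advertisement.
# State names are assumed; the 5-second window is from the example.

IGNORE_AFTER_S = 5.0

def ad_state(shown_at_s, now_s, gestured=False, dismissed=False):
    """Return 'selected', 'dismissed', or 'shown' for a displayed ad."""
    if dismissed:
        return "dismissed"   # user explicitly waved the ad away
    if gestured:
        return "selected"    # user requested more information
    if now_s - shown_at_s >= IGNORE_AFTER_S:
        return "dismissed"   # ignored by default; removed from view
    return "shown"
```

The display loop would evaluate this on each frame and remove ads once they reach the `dismissed` state.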
In other embodiments, the eyepiece may include an audio device. Accordingly, the displayed content may comprise a local advertisement and audio, such that the user may also hear a message or other sound effects as they relate to the local advertisement. By way of example, and referring again to Figure 18, while the user sees the beer being poured, he may actually hear an audio transmission corresponding to the actions in the advertisement. In this case, the user may hear the bottle cap being opened, followed by the sound of the liquid pouring out of the bottle and onto the rooftop. In other embodiments, a descriptive message may be played, and/or general information may be given as part of the advertisement, or both. In embodiments, any audio may be played as desired for the advertisement.
In accordance with another embodiment, social networking may be facilitated through the use of augmented-reality-enabled devices, such as a camera lens provided in the eyepiece. This may be utilized to connect several users, or other persons who may not have augmented-reality-enabled devices, so that they may share thoughts and ideas with each other. For example, the wearer of the eyepiece may be sitting on a school campus with other students. The wearer may connect with, and send a message to, a first student who is present in a coffee shop. The wearer may ask the first student about persons interested in a particular subject, such as environmental economics. As other students pass through the wearer's field of view, a camera lens fitted inside the eyepiece may track the students and match them to a networking database, such as "Google me," which may contain public profiles. Profiles of interested and relevant persons from the public database may appear and pop up in front of the wearer on the eyepiece. Some profiles that are not relevant may be blocked, or appear blocked, to the user. The relevant profiles may be highlighted for the wearer's quick reference. The relevant persons selected by the wearer may be interested in the subject of environmental economics, and the wearer may connect with them. Further, they may also be connected with the first student. In this manner, a social network may be established by the wearer with the use of the eyepiece enabled with augmented reality features. The social networks managed by the wearer, and the conversations therein, may be saved for future reference.
The present invention may be used in a real-estate scenario employing an augmented-reality-enabled device, such as a camera lens configured in the eyepiece. According to this embodiment, the wearer may wish to obtain information about a place at which the user may be present at a particular time, such as while driving, walking, or jogging. The wearer may, for instance, want to understand residential benefits and drawbacks in that place. He may also want detailed information about the facilities in that place. Therefore, the wearer may utilize a map, such as a Google online map, and identify real estate that may be available for lease or purchase. As noted above, the user may receive information about real estate for sale or rent using a mobile internet application such as Layar. In one such application, information about buildings within the user's field of view is projected onto the inside of the glasses for consideration by the user. Options may be displayed to the wearer on the eyepiece lens, for example for scrolling using a trackpad mounted on the frame of the glasses. The wearer may select and receive information about a selected option. An augmented-reality-enabled scene of the selected option may be displayed to the wearer, and the wearer may view pictures and take a facility tour in a virtual environment. The wearer may further receive information about real-estate agents and fix a meeting with one of them. An email notification or a call notification may also be received on the eyepiece for confirmation of the appointment. If the wearer finds the selected real estate of worth, a deal may be struck, and it may be purchased by the wearer.
According to another embodiment, customized and sponsored tours and travels may be enhanced through the use of an augmented-reality-enabled device, such as the camera lens provided in the eyepiece. For example, the wearer (as a tourist) may arrive in a city, such as Paris, and want to receive tour- and sightseeing-related information about the area, in order to plan his visit accordingly for the subsequent days of his stay. The wearer may put on his eyepiece, or operate any other augmented-reality-enabled device, and give a voice or text command regarding his request. The augmented-reality-enabled eyepiece may locate the wearer's position through geo-sensing techniques and determine the wearer's tourist preferences. The eyepiece may receive and display customized information on the screen in accordance with the wearer's request. The customized tour information may include, without limitation, information about the following: art galleries and museums, monuments and historical places, shopping complexes, entertainment and nightlife spots, restaurants and bars, most popular tourist destinations and tour centers/attractions, most popular local/cultural/regional destinations and attractions, and the like. Based on the user's selection of one or more of these categories, the eyepiece may prompt the user with other questions, such as time of stay, cost of tour, and the like. The wearer may respond via voice command and in return receive customized tour information in an order selected by the wearer. For example, the wearer may give art galleries a higher priority than monuments. Accordingly, that information may be made available to the wearer. Further, a map may also appear in front of the wearer, with different sets of tour options having different priority ranks, such as:
Priority rank 1: First tour option (Champs-Élysées, Louvre, Rodin, Museum, Famous Café)
Priority rank 2: Second option
Priority rank 3: Third option
For example, the first option may be selected by the wearer, since it is ranked as having the highest priority based on the preferences indicated by the wearer. Advertisements related to sponsors may pop up right after selection. Subsequently, a virtual tour may begin in an augmented-reality manner closely resembling the real environment. The wearer may, for example, take a dedicated 30-second tour of a vacation package at the Atlantis resort in the Bahamas. The virtual 3D tour may include a quick look at the rooms, beach, public spaces, parks, facilities, and the like. The wearer may also experience the shopping facilities in the area and receive offers and discounts in those places and shops. At the end of the day, the wearer may have experienced a whole day's tour while sitting in his chamber or hotel. Finally, the wearer may decide and arrange his plan accordingly.
Another embodiment may allow information concerning automobile repair and maintenance services through the use of an augmented-reality-enabled device, such as a camera lens provided in the eyepiece. The wearer may receive advertisements related to automobile repair shops and dealers by sending a voice command for the request. The request may, for example, include a requirement for an oil change in the vehicle/car. The eyepiece may receive information from the repair shop and display it to the wearer. The eyepiece may pull up a 3D model of the wearer's vehicle and show the amount of oil left in the car through an augmented-reality-enabled scene/view. The eyepiece may show other relevant information about the wearer's vehicle as well, such as maintenance requirements for other parts, for example the brake pads. The wearer may see a 3D view of the worn brake pads and may be interested in getting them repaired or replaced. Accordingly, the wearer may schedule an appointment with a vendor to fix the problem by using the integrated wireless communication capability of the eyepiece. The confirmation may be received through an email or incoming-call alert on the eyepiece camera lens.
According to another embodiment, gift shopping may benefit through the use of an augmented-reality-enabled device, such as the camera lens provided in the eyepiece. The wearer may post a request for a gift for some occasion through a text or voice command. The eyepiece may prompt the wearer to answer questions about his preferences, such as the type of gift, the age group of the person who is to receive the gift, the cost range of the gift, and the like. Various options may be presented to the user based on the received preferences. For instance, the options presented to the wearer may be: a cookie basket, a wine and cheese basket, an assortment of chocolates, a golfer's gift basket, and the like.
The available options may be scrolled by the wearer, and the best-suited option may be selected via voice command or text command. For example, the wearer may select the golfer's gift basket. A 3D view of the golfer's gift basket, along with a golf course, may appear in front of the wearer. The virtual 3D view of the golfer's gift basket and golf course, enabled through the augmented display, may be perceived as being very close to the real-world environment. The wearer may finally respond to the address, location, and other similar queries prompted through the eyepiece. A confirmation may then be received through an email or incoming-call alert on the eyepiece camera lens.
In embodiments, the eyewear platform may be used in conjunction with various control mechanisms, taking physical and informational inputs, performing processing functions, and controlling panels, surfaces, and systems (including through feedback loops), in order to interact with content and execute e-commerce transactions. While such e-commerce and content scenarios are numerous, some such scenarios include, but are not limited to, retail shopping environments, educational environments, transportation environments, home environments, event environments, dining environments, and outdoor environments. While these fields are described herein, various other scenarios will be apparent to those skilled in the art.
In embodiments, the eyewear platform may be used in a retail shopping environment. For example, the user may use the glasses to receive content related to items and/or environments of interest. The user may receive and/or search for pricing information or alternative offers in the retail shopping environment, product information (SKU/barcode), ratings, advertisements, GroupOn offers, and the like. In various embodiments, the user may find or obtain location information for a particular item. The user may also obtain information regarding loyalty programs related to a particular brand, item, and/or shopping environment. Further, the user may use the glasses, equipped with a camera, scanner, QR reader, or the like, to scan items into a shopping basket. In addition, the user may use the eyepiece to detect the best item in a group of items. As an example, the user may activate a feature of the glasses to visualize an item in a particular mode, such as using a program to determine or sense the density or thickness of an item in order to find the best one in a bunch. In embodiments, the user may use the glasses to negotiate the price of an item or to offer a price of his preference. For example, after scanning an item, whether virtually or using a scanner or the like associated with the glasses, the user may make a gesture, move his eyes, use a voice command, or use other means to offer the price he would pay for the item. The user may further use the glasses to order a scanned item and then pay for it by displaying or providing a payment method via the user interface. Such payment may be indicated by hand gestures, eye movements, and the like, as described herein. Similarly, the user may redeem "points" or rewards, such as through GroupOn, during her shopping trip, and receive promotions related to particular items and/or facilities. Further, the user may perform image recognition using the glasses, such that an item is identified in an interface and an order is placed for the item. For example, a program used by the glasses may allow the user to use the glasses to identify a watch in a storefront, so that when the item is identified, a menu for ordering the item is triggered in the interface. In other embodiments, information may be entered into the eyewear platform by scanning a barcode, QR code, product label, or the like. As the user moves about the retail environment, or while he is engaging with a retail interface using the glasses, promotional information (deals, signs, advertisements, coupons, and the like) may be scanned or received by the glasses, or otherwise identified. The user may scan a loyalty card with the glasses for use in a transaction, or otherwise enter such information for use during a retail transaction. In embodiments, the glasses may assist with navigation and guidance. For example, the user may be presented with a detailed map of a store, and may be provided with aisle content labels, thereby allowing the user to better navigate to items and to better navigate the retail environment. The user may capture product images, or download product images from the real environment, such that the images may be used to purchase the item, create notes about the item, generate or receive ratings, reviews, and product information for the item, and the like. Further, use of the geographic location of the object image and of the glasses may allow the user to receive the nearest location of an item, local reviews of the item, and the like. In embodiments, the geographic location of the user may allow a particular object image to be generated or more appropriately identified.
As a more specific example, a system may include an interactive head-worn eyepiece worn by a user, where the eyepiece includes a module for determining that the eyepiece is proximate to a retail environment, and the system may include an optical assembly through which the user views the surrounding retail environment, a 3-D processing module for recognizing a feature of the environment and rendering, on the head-worn eyepiece, a 3-D display of advertising content for the retail location of the eyepiece, an image processing module for capturing and processing images of the environment of the wearer of the head-worn eyepiece, and an integrated image source for introducing the content to the optical assembly, where the integrated image source renders the 3-D display as an overlay on the environment, and where the integrated image source presents the 3-D display of advertising content related to the retail environment. In embodiments, the 3-D processing module may lock a display element to the recognized feature of the environment, and the content may be presented in the display with a relationship to the recognized feature. In embodiments, the rendering of the 3-D display of the advertisement may be the result of at least one of scanning a barcode, a QR code, a product label, and the like. It may also be the result of: purchasing a product, entering a product image into the eyepiece, entering (or moving into) a location of the retail environment, fixing the user's eyes on a product, and entering loyalty program information into the eyepiece. In embodiments, a second 3-D display may be rendered as a result of at least one of: scanning a product, purchasing a product, entering a location of the retail environment, and the like. In embodiments, the user may execute an e-commerce transaction through the eyepiece, and the transaction may include scanning an item for purchase, selecting an item in comparison with other items, negotiating a price, redeeming points, redeeming a promotion, ordering an item, and the like. In embodiments, the advertisement may include the location of an item in proximity to the user. The location of the item may be displayed relative to the location of the user, and the user may be given guidance to the item. In various embodiments, the eyepiece may be used for social networking, and the eyepiece may employ facial recognition of another user in the retail environment. Further, the eyepiece may be used to recognize the presence of a person in the environment and to present social networking content related to the relationship between the wearer and the recognized person. In addition, the user may send and/or receive a friend request by making a gesture with a part of his body. In embodiments, the user may compare the prices of items through the advertisement. In embodiments, the advertisement may include audio content. In embodiments, recognizing a feature may include at least one of: automatic processing of an image containing the feature, checking the feature with a signal, communicating with the feature, recognizing the feature by processing the location of the feature, retrieving information about the feature from a database, user designation of the feature, and the like. Further, the user may designate a feature for holding overlaid content through interaction with the user interface of the eyepiece. In embodiments, the overlay may present content on or adjacent to the recognized feature, and in further embodiments, the recognized feature may be at least one of: an item to be purchased, an item for sale, a sign, an advertisement, an aisle, a location in a store, a kiosk, an information desk, a cash register, a television, a screen, a shopping cart, and the like.
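The feature-locked overlay described above (a 3-D processing module locking a display element to a recognized feature so the content keeps its relationship to that feature) can be sketched in simplified form. The following is a minimal illustrative sketch only, not part of the disclosure; all class and function names, and the fixed-offset locking scheme, are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Feature:
    """A recognized feature of the retail environment (e.g. a product label)."""
    name: str
    x: float  # position in the wearer's field of view, degrees
    y: float

@dataclass
class Overlay:
    """A display element rendered by the integrated image source."""
    content: str
    x: float
    y: float
    locked_to: str

def lock_overlay(content: str, feature: Feature,
                 dx: float = 0.0, dy: float = -2.0) -> Overlay:
    """Lock an ad overlay to a recognized feature at a fixed offset."""
    return Overlay(content, feature.x + dx, feature.y + dy, locked_to=feature.name)

def update_overlay(overlay: Overlay, feature: Feature,
                   dx: float = 0.0, dy: float = -2.0) -> Overlay:
    """Re-anchor the overlay when its feature moves in the field of view,
    preserving the overlay's relationship to the feature."""
    if feature.name != overlay.locked_to:
        return overlay
    return Overlay(overlay.content, feature.x + dx, feature.y + dy, overlay.locked_to)

# A scan of a product label triggers rendering of the ad overlay:
label = Feature("watch-label", x=10.0, y=5.0)
ad = lock_overlay("20% off this watch today", label)
print(ad.x, ad.y)  # 10.0 3.0
# The wearer turns her head; the feature shifts, and the overlay follows:
label_moved = Feature("watch-label", x=4.0, y=6.0)
ad = update_overlay(ad, label_moved)
print(ad.x, ad.y)  # 4.0 4.0
```

The same locking step would apply whichever trigger (barcode scan, QR code, gaze fixation) produced the overlay.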
In embodiments, the glasses may be used in an educational environment. As an example, the glasses may display e-learning content, such as is found in textbooks or elsewhere. The glasses may allow the user to view, develop, and review items in preparation for a test. In various embodiments, the user may be monitored while taking a test. The glasses may time him as the user pages through material, and may track the user's responses, possibly adjusting the exam according to the progression of the user's answers and/or tests. In further embodiments, the user may view an augmented reality (AR) overlay through the glasses. In embodiments, the AR overlay may include step-by-step guidance in a laboratory, classroom, and the like. In embodiments, a virtual professor may be displayed, allowing interaction through video, audio, and conversation. The user may view blackboard/whiteboard notes through the glasses, and he may enter additional items on the board, which may be shared with other users when they view the blackboard/whiteboard in the user interface or when they view the real board, such that AR notes may be added and/or overlaid when a user views a particular blackboard/whiteboard. In embodiments, the glasses may provide a social networking platform to members of a class or educational course, and may provide social networking content directed to and about class members.
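The shared AR-notes idea above amounts to keying notes to a board identifier, so that any user whose eyepiece recognizes that board sees the same overlaid notes. A minimal sketch, under the assumption of a shared store keyed by board id; the names and in-memory storage are invented for illustration and are not part of the disclosure.

```python
from collections import defaultdict

# board id -> list of (author, note) pairs shared among users
board_notes = defaultdict(list)

def add_note(board_id: str, author: str, note: str) -> None:
    """A user enters an additional item on a recognized board."""
    board_notes[board_id].append((author, note))

def notes_for(board_id: str):
    """What an eyepiece overlays when it recognizes this board."""
    return board_notes[board_id]

add_note("room-101-whiteboard", "alice", "exam moved to Friday")
add_note("room-101-whiteboard", "bob", "see eq. 3 for the derivation")
print(notes_for("room-101-whiteboard"))
```

In a real system the board id would come from the feature-recognition step, and the store would be a networked service rather than a local dictionary.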
In embodiments, the glasses may be used in conjunction with commerce in an educational environment. As an example, the user may use the glasses to purchase courses or otherwise track content progress and course credits. Further, the user may monitor exam and test levels as well as administrative dates for upcoming exams and tests; the user may download course credit/class information; the user may capture assignments discussed in class, listed on a syllabus, or otherwise obtained, and add such content to a calendar; and the user may meet with friends or class members by communicating with others via the glasses. In embodiments, the user may view his bills and tuition reports for review and tracking. In embodiments, the user may purchase a course to do the same, or he may use a course wherein the course provides advertisements associated therewith.
In further embodiments, the user may use the glasses in an educational environment as follows. The user may scan an exam/test paper through the glasses for review, working, and the like. The user may scan, or otherwise capture, data associated with textbook content, manuals and/or workbooks, and blackboard/whiteboard content, for note taking and assignment tracking. The user may scan or capture data related to posters/signs. The user may thereby track upcoming student meetings, descriptions of new offerings, meeting locations, and the like. In various embodiments, the user may capture the faces of classmates, friends, persons of interest, and the like. In embodiments, the glasses may track the user's eye movements to verify interaction with content. In embodiments, the glasses may allow "Lifestride" or other functionality to take in content and the like. The user may gesture with a pen that takes notes and communicates with the glasses through its movement, and the glasses may store the user's notes. In other embodiments, the user may make gestures, and the glasses may record notes according to such gestures; and in further embodiments, another sensor associated with the user's hand may allow the user to write with a pen while the notes are recorded by the glasses.
In embodiments, a system may include an interactive head-worn eyepiece worn by a user, where the eyepiece includes a module for determining that the eyepiece is proximate to an educational environment. Further, the system may include: an optical assembly through which the user views the surrounding environment; a processing module for recognizing a feature of the environment and rendering educational content related to the environment; an image processing module for capturing and processing images of the environment of the wearer of the head-worn eyepiece, where the image processing module may lock a display element to the recognized feature of the environment; and an integrated image source for introducing the content to the optical assembly, where the integrated image source renders the educational content as an overlay on the environment, and where the content may be presented in the display in relation to the recognized feature. In embodiments, the integrated image source may present a display of educational content related to the educational environment, and such a relationship with the recognized feature may not be presented. In embodiments, the rendering of the educational content may be the result of scanning a barcode, QR code, or the like; it may also be the result of: entering a textbook image into the eyepiece, entering an image of distributed material into the eyepiece, recognizing a sign in the environment, and entering a location of the educational environment. In embodiments, the educational environment may be a classroom, gymnasium, auto shop, garage, outdoor environment, gym, laboratory, factory, place of business, kitchen, hospital, and the like. Further, the educational content may be text, textbook excerpts, instructions, video, audio, laboratory protocols, chemical structures, 3-D images, 3-D overlays, text overlays, class books, tests, prescriptions, class notes, medical records, client files, safety instructions, and daily exercises. In embodiments, the educational content may be associated with, or overlaid on, an object in the environment. In various embodiments, the object may be a whiteboard, blackboard, machine, automobile, aircraft, patient, textbook, projector, and the like. In embodiments, the system may be used for social networking, and may further employ facial recognition of at least one of a classmate, a teacher, and the like in the environment. In various embodiments, the user may send and/or receive a friend request by making a gesture with a part of his body. In various embodiments, the user may interact with the content for taking an exam, completing an assignment, viewing a syllabus, viewing course items, practicing a skill, tracking course progress, tracking credits, taking notes, recording notes, and asking questions. In embodiments, the overlay may present content on or adjacent to the recognized feature. Further, the recognized feature may be at least one of: a poster, blackboard, whiteboard, screen, machine, automobile, aircraft, patient, textbook, projector, monitor, desk, smart board, and the like. As examples, notes may appear on a display framed by the blackboard; a movie may appear on a display where the screen is located; a molecular display may appear on the blackboard; and the like. In embodiments, recognizing a feature may include at least one of: automatic processing of an image containing the feature, checking the feature with a signal, communicating with the feature, recognizing the feature by processing the location of the feature, retrieving information about the feature by processing its location, retrieving information about the feature from a database, and user designation of the feature. Further, the user may designate a feature for holding overlaid content through interaction with the user interface of the eyepiece.
In embodiments, the glasses may be used in a transportation environment. As an example, the user may retrieve or capture transportation-related content, such as schedules, availability, delays, and cancellations. For example, when the user arrives at the airport, he may view his flight information through the glasses and see whether his flight is on time or delayed. In embodiments, the user may view his seat/berth selection and select snack and meal preferences. He may check in through the glasses, and in embodiments he may exchange or update his seat selection through the glasses. In embodiments, a pilot user may be given a step-by-step walk-through of the checklist required by the FAA before a flight. In embodiments, a train conductor, pilot, or the like may be given guidance and navigational indications while operating the train, aircraft, and the like. In other embodiments, a passenger user may review safety information through the glasses; for example, the user may view pre-flight safety instructions, in which he is shown how to operate emergency equipment, and the like. In embodiments, the user may use the glasses to make associated reservations, such as renting a car, booking a hotel, and the like. In embodiments, the user may schedule a tour to attend in person, and/or he may take a virtual tour of a region of interest through the glasses. He may view the surroundings of the destination to which he is traveling, so that he becomes familiar with the area before arrival. In other embodiments, he may also view and compare various deals. The user may view and/or receive loyalty content, such as the rewards available for a particular account, which points he may redeem and for what items, and the like. The user may use the glasses to redeem points with designated airlines, rental cars, hotels, and the like. In embodiments, the user may use the glasses for networking purposes in a travel or transportation environment. For example, the user may find out who the people are on his particular flight or train. The user may also use the glasses to view entertainment content during transportation. As an example, an in-flight movie may be streamed to the user's glasses. In embodiments, the user may view content related to various locations, and he may view AR landmarks and the like. As an example, as the train or aircraft passes through a landscape, the user may view AR overlays (such as landmarks) associated with items of interest for a particular region. In embodiments, the user may receive advertisements as he passes billboards/signs during his transit. Further, the user may receive personal information related to the transportation professionals involved in his transport. As an example, the user may receive information related to the taxi driver, such as the driver's safety record, or he may view a pilot's record of accidents and/or violations, which may reflect a safety rating for the pilot.
In addition, the user may use the glasses in connection with commerce in a transportation environment. For example, the user may use the glasses to reserve a seat, redeem reward points in order to reserve a seat, and arrange and pay for meals during transport. The user may find a flight and make a reservation/payment for it, rent a car, and book a hotel, taxi, bus, and the like. The user may network with people related to his travel, such as other passengers. In embodiments, the user may use the glasses for navigation. For example, the user may be given maps of buses and taxis to show the best routes and methods for getting around a city. The user may pay for an application for the same purpose, and/or view advertisements associated with the application for the same purpose. The user may interact with AR content in and around landmarks during his travels, and interact with advertisements and promotions from billboards, AR-based signs, and the like.
In embodiments, the user may enter items into the eyewear platform in a transportation environment. For example, he may scan his ticket using the glasses to begin the check-in process. He may be provided with a dashboard that shows speed, fuel, and GPS location during his transport. The glasses may communicate with the vehicle's IT system via Bluetooth to display the instrument panel and provide information about the vehicle and/or mode of transportation. In embodiments, the user may use the glasses to recognize the faces of other passengers and/or store images of other passengers by entering the images into the glasses. The user may enter landmark-related content into the glasses for interaction, or to create a database of the content for future recall. In embodiments, the user may enter billboards/signs, which may or may not be AR-based, for storage of and interaction with them.
In addition, a system may include an interactive head-worn eyepiece worn by a user, where the eyepiece includes: a module for determining that the eyepiece is proximate to a transportation environment; an optical assembly through which the user views the surrounding environment; a processing module for recognizing a feature of the environment and rendering transportation content related to the transportation environment; an image processing module for capturing and processing images of the environment of the wearer of the head-worn eyepiece, where the image processing module may lock a display element to the recognized feature of the environment; and an integrated image source for introducing the content to the optical assembly, where the integrated image source renders the transportation content as an overlay on the environment, and where the content may be presented in the display in relation to the recognized feature. In embodiments, the integrated image source may present a display of transportation content related to the transportation environment, and such a relationship with the recognized feature may not be presented. In embodiments, the rendering of the transportation content may be the result of: scanning a barcode, QR code, ticket, or the like; entering an image of a ticket for transportation; or entering a train, train station, taxi stand, taxi, airport, aircraft, ship, platform, subway, subway station, and the like. In embodiments, the transportation content may be text, video, audio, a 3-D image, a 3-D overlay, a text overlay, a guide, a schedule, a map, navigation, an advertisement, the location of a point of interest, auxiliary resources, safety instructions, flight instructions, an operator checklist, FAA information, flight information, arrival and departure airport information, an itinerary, and the like. In embodiments, the auxiliary resources may include making a hotel reservation, making a car rental reservation, making a dinner reservation, marking personal preferences, changing a seat selection, finding local entertainment, resources for arranging a local tour, and the like. In embodiments, the user may use the eyepiece to purchase a ticket for travel by flight, ship, or train; purchase a pass for riding the subway; view schedules; compare travel prices; retrieve directions; retrieve transit routes; view a map of the current location; view high-usage routes for a mode of transportation; and the like. In various embodiments, the content may be associated with a vehicle for displaying information about that vehicle, where such information includes emergency exit information, maintenance information, operational information, dashboard information, model information, and the like. The system may be used for social networking, and the system may employ facial recognition of a traveler, an operator, or the like in the environment. The user may send and/or receive a friend request by making a gesture with a part of his body. In embodiments, the eyepiece may be used to recognize the presence of a person in the environment and to present social networking content related to the relationship between the wearer and the recognized person. In embodiments, the user may interact with a displayed advertisement to obtain additional information. In embodiments, the content may augment the environment, including any of the following: a visual indication, an audio indication, a visual marker, and an overlaid route plan for various purposes (including for escaping from the environment in case of emergency, and the like). In embodiments, the overlay may present content on or adjacent to the recognized feature. Further, the recognized feature may be at least one of: a poster, a train, an aircraft, a taxi, a ship, a subway, a screen, a kiosk, a map, a window, and a wall. In embodiments, recognizing a feature may include at least one of: automatic processing of an image containing the feature, checking the feature with a signal, communicating with the feature, recognizing the feature by processing the location of the feature, retrieving information about the feature from a database, and user designation of the feature. Further, the user may designate a feature for holding overlaid content through interaction with the user interface of the eyepiece.
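The transportation embodiment ties each rendering of content to a trigger: a ticket scan, entering an airport or aircraft, an operator checklist request, and so on. That dispatch can be sketched as a simple trigger-to-content mapping. This is an illustrative sketch only; the event names and content lists are invented and are not drawn from the disclosure.

```python
# Trigger event -> transportation content the integrated image source
# should render as an overlay for that event.
TRIGGERS = {
    "ticket-qr": ["itinerary", "seat map", "check-in confirmation"],
    "enter-airport": ["departure board", "gate navigation", "arrival and departure info"],
    "enter-aircraft": ["safety instructions", "emergency exit overlay"],
    "operator-checklist": ["FAA checklist", "flight instructions"],
}

def content_for(event: str) -> list:
    """Return the content to overlay for a given trigger event;
    an unrecognized event renders nothing."""
    return TRIGGERS.get(event, [])

print(content_for("enter-aircraft"))
# ['safety instructions', 'emergency exit overlay']
```

In practice the events would be produced by the eyepiece's proximity-determination and image-processing modules rather than passed in as strings.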
In embodiments, the eyewear platform may be used in a home environment. In embodiments, the glasses may be used in conjunction with content. For example, the glasses may be used for entertainment, where the user watches media at home. The glasses may also be used for shopping, such as generating grocery lists and the like, taking inventory of needed items, and storing such content for later review. The user may use the glasses for family coordination, paying bills via the glasses, producing to-do lists of household tasks, and the like. For example, the glasses may be used to make and keep doctor's appointments, upcoming soccer games, and the like. The glasses may be used for program guidance. As an example, the glasses may be used to instruct the user in controlling electrical appliances, DVD players, VCRs, remote controls, and the like. Further, the glasses may be used for security and/or safety. The user may activate an alarm system to make sure it is on while at home or away from home. The user may check home cameras when away, turn home lights on and off, and the like. The user may be given instructions for emergencies; for example, the user may be given instructions on what to do during a fire, hurricane, and the like. The user may turn on a feature of the glasses described herein to see through smoke, and the like. During such an emergency, the user may use the glasses to track family members and communicate with them. The glasses may provide assisting CPR guidance, 911 calling, and the like.
In embodiments, the glasses may be used in conjunction with commerce in the home environment. For example, the user may use the glasses to order food delivery, check on dry cleaning, schedule a dry-cleaning pickup, and the like. The user may order entertainment content, movies, video games, and so on. In embodiments, the user may find and use instructional materials for household projects, pay bills, and the like. The user may view advertisements and/or promotions while at home and act on them. For example, if an advertisement is displayed in the user's glasses while the user is using a blender in the kitchen, the advertisement may prompt the user to find more information about a new blender, and the user may select the advertisement to learn more about the device.
In embodiments, the user may use the glasses in the home setting such that the user enters information into the glasses. As an example, the user may input paperwork for storage, recall, interaction, and the like. The user may input shopping lists, bills, checklists, manuals, mail, and so on. The user may input advertisements from AR-enabled paper mail advertisements, television, radio, and the like. The user may scan a paper advertisement to view or receive additional AR information associated with the advertisement. The user may input embedded symbols and/or identifiers, for example to identify an appliance or other hardware. The user may input Wi-Fi network content to the glasses. Further, the user may input television content, such as screen and smart-television content. The user may thereby interact with such content through the eyewear platform. The user may input remote-control commands into the eyewear platform such that the user may operate various devices, such as a television, VCR, DVD player, appliances, and the like. Further, the user may input security system content, enabling the user to interact with and control the security system, cameras associated with the security system, and the like. The user may view various camera feeds associated with the security system so that he may check areas around the home environment through the eyewear platform. The glasses may connect with such cameras via Bluetooth, via the Internet, via a Wi-Fi connection, and so on. The user may further be able to set alarms, turn alarms off, check alarms, and interact with alarms associated with the security system.
Further, a system may include an interactive head-worn eyepiece worn by a user, wherein the eyepiece includes: a module for determining that the eyepiece is in proximity to a home environment; an optical assembly through which the user views the surrounding home environment; a processing module for identifying features of the environment and rendering home-related content relevant to the environment; an image-processing module for capturing and processing images of the environment of the wearer of the head-worn eyepiece, which may lock display elements to an identified feature of the environment; and an integrated image source for introducing content to the optical assembly, wherein the home-related content may be rendered by the integrated image source as an overlay on the environment, and wherein the content may be presented in a relationship relative to the identified feature in the display. In embodiments, the integrated image source may present a display of content related to the environment, and such content may also be presented without a relationship to the identified feature. In embodiments, the rendering of content may be a result of the following: entering the home, the user's eyes fixing on an item in the home, the eyepiece identifying a marker in the environment, operating another device of the household, and the like. In embodiments, the content may include a user interface for operating devices such as: a VCR, DVR, satellite receiver, set-top box, video-on-demand device, audio device, video game console, alarm system, home computer, heating and cooling system, and so on. In embodiments, the user may interact with the user interface through eye movements, hand gestures, a nod of the head, and the like. In embodiments, the content may allow the user to complete tasks such as: generating a shopping list, checking a grocery inventory, paying bills, viewing bills, activating a device, operating lights, generating virtual communications for family members and/or others, ordering a delivery service (dry cleaning, food, etc.), acting on an advertisement in the environment, and so on. In embodiments, the user may identify, through the eyepiece, another person in the home environment or a face in the home. In embodiments, the content may include instructions, and the instructions in an emergency setting may be at least one of audio instructions, video instructions, and the like. In embodiments, the content may augment the environment, or content may be added to the environment, and may include any of the following: a visual indication, an audio indication, a visual marker, an overlaid route plan for escaping the environment in case of emergency, and so on. In embodiments, the content may be generated in response to an embedded symbol, television audio and/or video content, an advertisement, and the like. In embodiments, the content may be retrieved from a user manual stored in the eyepiece, downloaded from the Internet, and so on. The content may include 3-D advertisements, audio, video, text, and the like. In embodiments, identifying a feature may include at least one of: automatic processing of an image containing the feature, interrogating the feature with a signal, communicating with the feature, identifying the feature by processing its position, retrieving information about the feature from a database, user designation of the feature, and so on. In embodiments, the user may designate a feature for holding overlay content by interacting with the user interface of the eyepiece. In embodiments, the overlay may present content on or adjacent to an identified feature. Further, the identified feature may be at least one of: an appliance, a note-writing station, a notepad, a calendar, a wall, an electronic device, a security system, a room, a door, a gateway, a key holder, and a fixture.
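The "locking" of a display element to an identified feature, as described above, can be sketched loosely in code. This is not part of the patent disclosure; it is a minimal illustration under assumed conventions, and every name and coordinate in it is invented:

```python
from dataclasses import dataclass

@dataclass
class IdentifiedFeature:
    name: str    # e.g. "refrigerator"
    x: float     # tracked position in the wearer's view, normalized 0..1
    y: float

@dataclass
class Overlay:
    feature: IdentifiedFeature
    content: str          # what the integrated image source draws
    dx: float = 0.0       # offset so the content sits on or beside the feature
    dy: float = -0.25

    def render_position(self):
        # "Locked" means the overlay position is always re-derived from the
        # feature's latest tracked position, so the content follows the feature.
        return (self.feature.x + self.dx, self.feature.y + self.dy)

fridge = IdentifiedFeature("refrigerator", 0.5, 0.5)
note = Overlay(fridge, "Grocery list: milk, eggs")
print(note.render_position())  # (0.5, 0.25)
fridge.x = 0.75                # wearer turns; the tracker updates the feature
print(note.render_position())  # (0.75, 0.25)
```

The design point is simply that the overlay never stores an absolute screen position; it re-derives its position from the feature each frame, which is one way to realize the "presented in a relationship relative to the identified feature" language.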
In embodiments, the user may use the glasses in an event setting. In various event settings, the user may use the eyewear platform to interact with content. As an example, the user may view schedules, ticket information, and/or ticket/seat availability for events such as concerts, ball games, various entertainment events, and business events. The user may view or otherwise interact with promotional information for an event. The user may view loyalty program content, such as points or reward values associated with an event, and the like. The user may be granted access to an event through, or in association with, a loyalty program or the like. The user may be given the opportunity to view "bonus" material for an event based on points status and the like. In embodiments, the user may view ancillary services and merchandise associated with an event, purchase such services and merchandise, and so on. In embodiments, the user may view AR content at the event, such as a first-down line, a goal marker, access to the players/performers, and the like. In various embodiments, the user may view optional video feeds, such as a feed from another location in the stadium when the user has a side-angle view, a backstage angle/video feed, and so on.
In embodiments, the glasses may be used in conjunction with commerce in the event setting. As an example, the user may purchase/reserve tickets, view selected/available seats, and the like. The user may reserve supporting items, such as purchasing a backstage pass, upgrading his seat, and so on. In embodiments, the user may purchase event-related merchandise, jerseys, concert apparel, posters, and the like. The user may further redeem points, such as those associated with a reward or frequent-attendee program. In embodiments, the user may purchase keepsakes and chronicles of the event, such as pictures and/or scene pictures from the event or game, digitally "signed" videos of part or all of the game or event, and the like. The user may, for an additional cost or for free, view additional video of, or commentary by, the players and/or performers during such an event.
In embodiments, the user may input items and/or data into the eyewear platform in an event setting. In various embodiments, the user may input a ticket/pass to find his seat, check into the event, and the like. The user may input AR-enhanced promotional material (such as posters and signs) to view it and/or interact with it. The user may input loyalty program information, scan a card for a particular event, and so on. Such a user may interact with an account related to the event, provide data to the account, activate the account, and so on. In embodiments, the user may input web content into the glasses via Wi-Fi, Bluetooth, and the like.
Further, a system may include an interactive head-worn eyepiece worn by a user, wherein the eyepiece includes: a module for determining that the eyepiece is in proximity to an event setting; an optical assembly through which the user views the surrounding event setting; a processing module for rendering, on the head-worn eyepiece, a display of event content for the event setting of the eyepiece; an image-processing module for capturing and processing images of the environment of the wearer of the head-worn eyepiece, the processing including identifying a feature related to the event and storing the position of the feature; and an integrated image source for introducing content to the optical assembly, wherein the event content is rendered by the integrated image source as an overlay on the environment viewed by the eyepiece wearer and the content is associated with the feature; wherein the integrated image source presents content related to the environment. In embodiments, the image-processing module may lock display elements to an identified feature of the environment, and the content may be presented in a relationship relative to the identified feature in the display. In embodiments, the rendering of content may be a result of at least one of: entering the event setting, the user's eyes fixing on an item at the event, identifying a feature in the environment, scanning the user's ticket, identifying the presence of a person at the event, inputting an image from the event, and so on. In embodiments, the content may include an augmented visual feed, including any of the following: a first-down line, a field marker line, a display of a performer, a display of a performer's instrument, instant replay, an augmented view, live video, optional views, advertisements related to the event, 3-D content, seat upgrade availability, and the like. In various embodiments, the content may include an augmented audio feed, including any of the following: player commentary, commentary audio, game sounds, augmented performance sound, performer commentary, live audio, and so on. The user may interact with the content through at least one of eye movements, hand gestures, a nod of the head, and the like. In embodiments, the eyepiece may be used to identify the presence of a person at the event, and social networking content related to the relationship between the wearer and the identified person may be presented. Further, the user may send and receive at least one of friend requests by making a gesture (such as nodding) with a part of his body. The system may include a user interface for purchasing at least one of event items, images and views of the event, and digitally signed memorabilia from the event. Further, the content may be generated in response to at least one of an embedded symbol, television content, an advertisement, and the like. In embodiments, the content may include augmented video and audio of at least one of: backstage, a dressing room, a baseball players' lobby, a baseball bullpen, a players' bench, and so on. In embodiments, identifying a feature may include at least one of: automatic processing of an image containing the feature, interrogating the feature with a signal, communicating with the feature, identifying the feature by processing its position, retrieving information about the feature from a database, user designation of the feature, and so on. Further, the user may designate a feature for holding overlay content by interacting with the user interface of the eyepiece. In embodiments, the overlay may present content on or adjacent to an identified feature, and in embodiments the identified feature may be an object of a game, including at least one of: the field, a ball, a goal, a scoreboard, a jumbotron, a screen, the distance traveled by a ball, a path, a stadium seat, and the like. In embodiments, the identified feature may be an object of an artist's performance, including at least one of: a musician, a musical instrument, a stage, a music stand, a performer, a set, a prop, a curtain, and so on. In various embodiments, the identified feature may be an object in a concession area, including at least one of: a doll, a stuffed animal, concert apparel, food, beverages, hats, clothing, beach towels, toys, sports memorabilia, concert memorabilia, and the like.
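The several identification methods enumerated above (signal interrogation, position lookup, image processing, and so on) amount to trying alternative identifiers until one succeeds. The following is a hypothetical sketch only, not the disclosed implementation; the landmark table, tolerance, and sensor-report format are all invented:

```python
from typing import Callable, Optional

Observation = dict  # whatever the eyepiece's sensors report about one feature
Identifier = Callable[[Observation], Optional[str]]

# Invented landmark table: feature name -> stored position in the venue
LANDMARKS = {"scoreboard": (0, 50), "goal": (0, 100)}

def by_signal(obs: Observation) -> Optional[str]:
    # A feature that announces itself (e.g. a beacon) is identified directly.
    return obs.get("beacon_id")

def by_position(obs: Observation) -> Optional[str]:
    # Position-based identification: nearest stored landmark within a tolerance.
    pos = obs.get("position")
    if pos is None:
        return None
    x, y = pos
    for name, (lx, ly) in LANDMARKS.items():
        if abs(x - lx) + abs(y - ly) < 5:
            return name
    return None

def identify(obs: Observation, methods: list) -> Optional[str]:
    # Try each identification method in turn until one succeeds.
    for method in methods:
        name = method(obs)
        if name is not None:
            return name
    return None

print(identify({"position": (1, 52)}, [by_signal, by_position]))  # scoreboard
print(identify({"position": (40, 40)}, [by_signal, by_position]))  # None
```

Ordering the methods from cheapest (a self-announcing beacon) to most expensive (image processing, omitted here) is one natural way to combine the "at least one of" alternatives the claim language lists.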
In embodiments, the eyewear platform may be used in a dining environment. For example, the glasses may be used to make requests regarding content in a dining environment. In embodiments, the user may use the glasses to make reservations for seating and the like, check possible seat availability, and view ratings, reviews, venue locations and content, and so on. The user may also view menu content and pricing, comparisons between the venue and other venues, details about food and beverages (such as reviews, nutritional content, how an item is prepared, etc.), wine ratings, automated wine pairings, and the like. The user may view social content, such as recognizing or identifying a person and/or interacting with patrons of the same venue. In embodiments, the user may view loyalty program content related to the user's account and/or a particular venue, such as dining points. The user may use the glasses to translate items on the menu, to look up the names and definitions of ingredients through search, and so on. The user may view videos or images of menu items. In embodiments, the user may view an AR version of the menu. In embodiments, the user may capture an image of the menu and view the image with an adjusted focal point, increased magnification, adjusted contrast, illumination of the menu, and the like. In embodiments, the user may view menu items with automatic wine and beverage pairings based on ratings, price, and the like. The user may access a database of what he has eaten and what he likes, and view reminders of past meals. In embodiments, the user may see an item as different from the item he is consuming. For example, if the user has selected a side salad, he may view it as a filet mignon, and so on.
In embodiments, the glasses may be used in commerce in the dining environment. For example, the glasses may be used to find a venue, make or update a reservation, browse a menu, select items of interest or items to purchase from the menu, and place an order at the venue. The glasses may be used to pay a bill, split a payment, calculate a tip, redeem points, and so on.
In embodiments, the glasses may be used in a dining environment to input data/items. In embodiments, the user may input content via Wi-Fi, Bluetooth, and the like. In embodiments, the user may input AR-enhanced menus, signs, and so on, to view them and/or interact with them. In embodiments, the user may input AR-enhanced advertising content to view it and/or interact with it. The user may input items for payment, credit/debit cards, points payment/redemption, and the like. Such input may be made by near-field communication and so on. In embodiments, the user may pay via facial recognition. In various embodiments, the glasses may be used to recognize the face of an employee, and such a payment may be requested based on such facial recognition. In other embodiments, the face of the user or of another individual may be recognized, and an account may be debited accordingly to make a payment.
In embodiments, a system may include an interactive head-worn eyepiece worn by a user, wherein the eyepiece includes: a module for determining that the eyepiece is in proximity to at least one of a dining environment and a drinking environment; an optical assembly through which the user views the surrounding environment; a processing module for identifying features of the environment and rendering dining-related content relevant to the environment; an image-processing module for capturing and processing images of the environment of the wearer of the head-worn eyepiece, which may lock display elements to an identified feature of the environment; and an integrated image source for introducing content to the optical assembly, wherein the integrated image source may render at least one item of dining-related content as an overlay on the environment, and the content may be presented in a relationship relative to the identified feature in the display. In embodiments, the integrated image source may present a display of content related to the environment, and such content may also be presented without a relationship to the identified feature. In embodiments, the rendering of content may be a result of at least one of: entering at least one of a dining environment and a drinking environment, the user's eyes fixing on a menu in the environment, opening a menu, identifying a marker in the environment, focusing on a sign in the environment, and so on. In embodiments, the content may include augmented menu content, including at least one of: ratings of the menu, comparisons of menu items, nutritional values of menu items, wine pairings with menu items, images of menu items, videos of menu items, audio descriptions of menu items, augmented magnification, contrast, and illumination of menu items, and classification of menu items by geographic region, ingredient, item rating, whether the user has previously consumed the item, and the like. In embodiments, the content may be received as a menu when the user is seated or the like. In embodiments, the user may interact with the content through at least one of eye movements, hand gestures, a nod of the head, and the like. In embodiments, the user may place an order via the eyepiece. In embodiments, the user may pay a check, bill, or charge via the eyepiece. In embodiments, the eyepiece may be used for social networking and may provide at least one of: the user's review of the environment and facial recognition of another person in the environment. In embodiments, the user may send and receive at least one of friend requests by making a gesture with a part of his body. The content may include additional information related to a menu item retrieved from the Internet. In embodiments, the overlay may present content on or adjacent to an identified feature. In embodiments, the identified feature may be at least one of: a poster, a frame, a menu board, a menu, a beverage container, a food cart, a bar, a table, a window, a wall, and the like. In embodiments, identifying a feature may include at least one of: automatic processing of an image containing the feature, interrogating the feature with a signal, communicating with the feature, identifying the feature by processing its position, retrieving information about the feature from a database, user designation of the feature, and so on. In embodiments, the user may designate a feature for holding overlay content by interacting with the user interface of the eyepiece.
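The menu classification described above (by region, rating, prior consumption) is, at bottom, grouping and sorting over per-item records. A small illustrative sketch, not part of the disclosure, with an invented menu and invented field names:

```python
# Invented sample menu; field names are illustrative only.
MENU = [
    {"item": "filet mignon", "rating": 4.6, "region": "France", "had_before": True},
    {"item": "side salad",   "rating": 3.9, "region": "local",  "had_before": False},
    {"item": "risotto",      "rating": 4.2, "region": "Italy",  "had_before": True},
]

def classify(menu, key):
    """Group menu items by a field (region, prior consumption, ...)."""
    groups = {}
    for entry in menu:
        groups.setdefault(entry[key], []).append(entry["item"])
    return groups

def by_rating(menu):
    """Order the augmented menu best-rated first."""
    return [e["item"] for e in sorted(menu, key=lambda e: e["rating"], reverse=True)]

print(by_rating(MENU))               # ['filet mignon', 'risotto', 'side salad']
print(classify(MENU, "had_before"))  # groups items the user has had before
```

The same `classify` call covers each classification axis the paragraph lists, which is why a generic key-based grouping is a plausible shape for this feature.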
In embodiments, the eyewear platform may be used in an outdoor environment. In embodiments, the glasses may be used for content interaction and/or viewing content. The user may view navigation information, such as trail position, time to destination, an AR overlay of obstacles on or along the trail that the user might otherwise be unable to see, a trail map, and the like. The user may be given the conditions of the outdoor environment, such as temperature, weather, planting conditions, fishing conditions, water levels, tide conditions, and so on. The user may communicate using the glasses, coordinating with a group by position relative to the outdoor environment, receiving weather alerts, and the like. The user may gather information, identifying plants, trees, animals, and birds, and naming a bird by its sound. In embodiments, the user may view an object, and by asking the glasses "what is this," the user may be presented with content and/or information about the object. In various embodiments, the user may obtain safety information, such as whether something is edible, poisonous, dangerous, and the like. For example, the user may ask "is this a dangerous snake?" while looking through the glasses, and the glasses may then be used to provide the user with information about the snake, whether it is venomous, and so on. In embodiments, the user may use the glasses to identify and/or receive content related to landmarks associated with the outdoor environment. Such landmarks may help the user navigate or understand the environment. In various embodiments, the user may use the glasses to view instructional guides, such as how to pitch a tent, tie a particular knot, cross difficult terrain, and the like. The user may ask "how do I pitch this tent," and the user may receive step-by-step instructions. In various embodiments, the user may view content about, or an analysis of, himself, his behavior, or his condition. The user may request updates from the glasses, such as "am I dehydrated," "am I hypothermic," "is my oxygen level low." Based on the results, the user may change his behavior to prevent a particular outcome or promote a particular outcome. In embodiments, the user may view social content related to other people's experiences on the trail, as well as environment and experience blogs, and the like. In embodiments, the user may be alerted that a ski run is for experts only, or the user may further be informed of current conditions, such as serious icy patches on various parts of the course.
In embodiments, the glasses may be used by the user for commerce related to the outdoor environment. The user may download related content relevant to the environment. As an example, the user may download trail maps, fishing maps, and data about catching fish, skiing, half-pipe riding, and the like. The user may arrange lodging, order supplies, rent equipment, arrange a guide, tours, or event entry, and obtain, for example, a fishing license, a hunting permit, or the like. In such a setting, the user may interact with social networks via the glasses; for example, the user may participate in a training club, communicate with others on the trail or with people in a particular environment, and so on. The user may mark and/or track goal-directed achievements. For example, the user may track or mark a goal of climbing Mount Whitney, mark a goal for a charity "fun run," and so on. The user may use blog-based business models and the like. In embodiments, the user may use social networking via the eyewear platform to raise sponsorship for an outdoor event.
In embodiments, the user may input content, data, and the like into the glasses while in the outdoor environment, or content related to the outdoor environment. In embodiments, the user may use the camera in the glasses for scenery recognition, and the user may use the glasses with GPS to provide information related to a particular environment or to navigate in a particular environment. The user may send communications to, and receive communications from, other users in the surrounding environment, or send and receive communications related to the environment. The user may input landmark data, use AR enhancement to view landmarks of the environment, and the like. The user may input features such as leaves and flowers, make notes related to these features, capture pictures of these features, and/or learn about them in the environment. The user may capture images of the items, animals, and the like that make up the environment to learn more about them, store data related to them, and interact with AR content related to them.
In embodiments, a system may include an interactive head-worn eyepiece worn by a user, wherein the eyepiece includes: a module for determining that the eyepiece is in proximity to an outdoor environment; an optical assembly through which the user views the surrounding outdoor environment; a processing module for rendering, on the head-worn eyepiece, a display of outdoor content for the outdoor environment of the eyepiece; an image-processing module for capturing and processing images of the environment of the wearer of the head-worn eyepiece, the processing including identifying a feature related to the environment and storing the position of the feature; and an integrated image source for introducing content to the optical assembly, wherein the integrated image source renders the outdoor content as an overlay on the environment viewed by the eyepiece wearer and associates the content with the feature, and wherein the integrated image source presents content related to the outdoor environment. In further embodiments, the image-processing module may lock display elements to an identified feature of the environment, and the content may be presented in a relationship relative to the identified feature in the display. In embodiments, the content may be rendered as a result of at least one of: entering the outdoor environment, the user's eyes fixing on an item in the environment, identifying a feature in the environment, identifying the presence of a person in the environment, inputting an image of the environment, focusing on a sign in the environment, and so on. In embodiments, the content may include augmented environment features, including at least one of: overlaid trail information, time-to-destination information, user progress information, landmark information, safety information about the environment, position relative to other resources in the environment, and information about organisms in the environment. In embodiments, the content may include instructions for the user, and the instructions may be at least one of: audio, video, images, 3-D images, an overlay on an object, step-by-step instructions, and the like. The user may interact with the content through at least one of eye movements, hand gestures, a nod of the head, and the like. The user may perform at least one of the following: arranging lodging, ordering supplies, renting equipment, arranging excursions, obtaining a license or permit for an activity, inputting a review of the environment, and so on. Further, the content may augment at least one of: camera input, GPS information, a landmark in the environment, and a feature in the outdoor environment. In embodiments, the eyepiece is used to identify the presence of a person in the environment, and social networking content related to the relationship between the wearer and the identified person is presented. Further, the user may send and receive at least one of friend requests by making a gesture with a part of his body. In embodiments, the content may be rendered based on an analysis of the user's condition. In embodiments, identifying a feature may include at least one of: automatic processing of an image containing the feature, interrogating the feature with a signal, communicating with the feature, identifying the feature by processing its position, retrieving information about the feature from a database, user designation of the feature, and so on. Further, the user may designate a feature for holding overlay content by interacting with the user interface of the eyepiece. In embodiments, the overlay may present content on or adjacent to an identified feature. Further, the identified feature may be at least one of: a plant, a tree, a shrub, a trail, a rock, a fence, a path, a field, a campsite, a cabin, a tent, a mode of water transportation, a marine vehicle, and an animal.
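The proximity-determination module could, under one plausible reading, compare a GPS fix against stored locations of known outdoor environments. This sketch is not the disclosed method; the coordinates, place name, and radius are invented for illustration:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two GPS fixes, in kilometers.
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6371.0 * 2 * asin(sqrt(a))

# Invented, approximate coordinates (not the actual trailhead location).
TRAILHEADS = {"Mount Whitney trailhead": (36.5785, -118.2923)}

def nearby_environment(lat, lon, radius_km=2.0):
    # The proximity module: report a known outdoor environment within radius.
    for name, (tlat, tlon) in TRAILHEADS.items():
        if haversine_km(lat, lon, tlat, tlon) <= radius_km:
            return name
    return None

print(nearby_environment(36.58, -118.29))  # Mount Whitney trailhead
print(nearby_environment(0.0, 0.0))        # None
```

A real module would presumably fuse GPS with the other cues the text mentions (camera scenery recognition, landmark data), but a geofence check of this shape is the simplest reading of "determining that the eyepiece is in proximity to an outdoor environment."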
In embodiments, the user may use the glasses in an exercise environment. In embodiments, the user may use the glasses to view, download, or otherwise interact with content. For example, the user may obtain an analysis of his own behavior or condition, such as by asking the glasses "am I dehydrated?", "am I hypothermic?", "is my oxygen level low?", and the like. In embodiments, the user may view content oriented toward a health club, such as club fees and offers, upcoming training courses, and so on. The user may view training-oriented content, such as coaching and instructional content. For example, the user may view instructions, videos, AR content, and the like on how to squat, stretch, use a piece of equipment, and so on. The user may view, comment on, and update blogs related to the exercise environment, such as a personal exercise blog.
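The self-queries above ("am I dehydrated?" and so on) suggest mapping raw sensor readings to yes/no answers against thresholds. The sketch below is purely illustrative: the field names are invented, and the thresholds are made up for the example and are not medical guidance:

```python
def status_answers(vitals: dict) -> dict:
    """Answer condition self-queries from raw sensor readings.

    Missing readings default to nominal values, so absent sensors
    never trigger a false alert. All thresholds are illustrative.
    """
    return {
        "am I dehydrated?":        vitals.get("hydration_pct", 100) < 60,
        "am I hypothermic?":       vitals.get("core_temp_c", 37.0) < 35.0,
        "is my oxygen level low?": vitals.get("spo2_pct", 98) < 90,
    }

readings = {"core_temp_c": 34.2, "spo2_pct": 95, "hydration_pct": 72}
answers = status_answers(readings)
print(answers["am I hypothermic?"])        # True
print(answers["is my oxygen level low?"])  # False
```

In the glasses described here, a positive answer would presumably drive the rendered content ("the content may be rendered based on an analysis of the user's condition"), such as an on-display warning or instructions.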
In embodiments, the user may use the glasses in commerce in the exercise environment. As an example, the user may download a master plan, for a fee or for free, such as a plan related to instruction, coaching, or other guidance. In embodiments, the user may track success and/or progress through a plan until completion. In embodiments, applications may have associated advertisements to be displayed to the user. In embodiments, the user may use the glasses for ancillary equipment purchases and sales. For example, the user may purchase new athletic shoes with arch-support insoles for running. In embodiments, the user may use the glasses for charitable activities, such as but not limited to a "fun run" or "climbing Mount Everest for charity X," where the user collects donations via the eyewear platform and/or views or updates blog entries for the activity.
In embodiments, the user may use the glasses to input information and/or data in the exercise environment. In various embodiments, the user may enter data for display and tracking, input data via sensors, and input images and video. Merely as an example, the user may record a video of another person performing an activity, and the video may then be used during his own training to perfect form, technique, and the like.
In embodiments, a system may include an interactive head-mounted eyepiece worn by a user, where the eyepiece includes: a module for determining that the wearer of the eyepiece is in or proximate to at least one exercise environment; an optical assembly through which the user views the surrounding exercise environment; a processing module for rendering exercise-related content on the head-mounted eyepiece; an image-processing module for capturing and processing images of the environment of the wearer of the head-mounted eyepiece and identifying features of the environment; and an integrated image source for introducing the content to the optical assembly, wherein the exercise-related content is rendered by the integrated image source as an overlay on the environment viewed by the user, wherein the overlay is fixed to a proximate identified feature as the user moves the eyepiece, and wherein the integrated image source presents content related to the exercise environment. In embodiments, the rendering of content may be the result of at least one of: entering the exercise environment, the user's eyes fixing on an element of the environment, automatic identification of a feature in the field of view of the eyepiece, use of a piece of equipment in the exercise environment, identification of a marker in the environment, focusing on a marker in the environment, and the like. In embodiments, the content may enhance the exercise, including at least one of: training-oriented content, club information content, instructions for the exercise, information about an upcoming class, and the like. In embodiments, the content may include at least one of 3-D content, audio, visual, video, and textual content. The user may interact with the content by at least one of eye movement, hand gesture, nodding, and the like. In embodiments, the content may include user information, including at least one of vital signs such as heart rate, exercise time, time required for a lap in a swimming competition, personal best times, historical usage data, and the like. The content may allow the user to purchase training classes, time on a machine, additional time at the club, beverages, health items, and the like. In embodiments, the content may be an advertisement for at least one of: an upcoming class, a health club, a discount at the juice bar, an equipment sale, and the like. Further, in embodiments, the eyepiece may be used for social networking, where the eyepiece provides at least one of: the user's comments on the environment and facial recognition of another person in the environment. Further, the user may at least one of send and receive friend requests by making a gesture with a part of his body. In embodiments, the user may send a friend request to, or receive a friend request from, another member, a trainer, a director, and the like. In embodiments, the overlay may present content at or near an identified feature. Further, the identified feature may be at least one of: a calendar, a wall, a window, a board, a mirror, a scale, a bicycle, a stationary bicycle, an elliptical machine (a piece of gym equipment), a treadmill, a punching bag, a track, a scoreboard, a goal, a region of a court, a region of a tennis court, and the like. In embodiments, identifying a feature may include at least one of: automatic processing of an image containing the feature, viewing the feature, signaling to the feature, communicating with the feature, identifying the feature by processing its location, retrieving information about the feature from a database, a user designation of the feature, and the like. In embodiments, the user may designate a feature for overlaying content, such as by interacting with a user interface of the eyepiece, and the like.
Another application that may be attractive to users is mobile online gaming using the augmented reality glasses. These games may be computer video games, such as those offered by Electronic Arts Mobile, UbiSoft, and Activision Blizzard, e.g., World of Warcraft (WoW). Just as games and entertainment applications are played on home computers (as opposed to work computers), augmented reality glasses may also run gaming applications. The screen may appear on the inside of the glasses so that the user may observe and participate in the game. In addition, controls for playing the game may be provided through a virtual game controller, such as a joystick, control module, or mouse, described elsewhere herein. The game controller may include sensors or other output-type elements attached to the user's hand, such as for feedback from the user by way of acceleration, vibration, force, pressure, electrical impulse, body temperature, electric field sensing, and the like. Sensors and actuators may be attached to the user's hand by way of a wrap, ring, pad, glove, bracelet, and the like. As such, an eyepiece virtual mouse may allow the user to translate motions of the hand, wrist, and/or fingers into motion of a cursor on the eyepiece display, where "motions" may include slow movements, rapid movements, jerky movements, position, change in position, and the like, and may allow the user to work in three dimensions without the need for a physical surface, including some or all of the six degrees of freedom.
As seen in FIG. 27, implementations of gaming applications 2700 may use both the internet and a GPS. In one embodiment, a game is downloaded from a game provider, perhaps using their web services and the internet as shown, via a customer database of the game provider, to the user's computer or to the augmented reality glasses. At the same time, the glasses, which also have telecommunication capabilities, receive and transmit telecommunication and telemetry signals via a cellular tower and a satellite. Thus, an online gaming system has access to information about the user's location and the user's desired gaming activities.
Games may take advantage of this knowledge of the location of each player. For example, a game may build in features that use the player's location, via a GPS locator or magnetometer locator, to award game points upon arrival at the location. The game may also send a message, e.g., display a clue, or display a scene or image, when the player reaches a particular location. A message, for example, may direct the player to a next destination, which is then provided to the player. Scenes or images may be provided as part of a struggle or obstacle that must be overcome, or as an opportunity to earn game points. Thus, in one embodiment, augmented reality eyepieces or glasses may use the wearer's location to quicken and enliven computer-based video games.
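The location-based point award described above can be sketched in a few lines. This is an illustrative example only, not an implementation from the patent: the function names, the 50 m trigger radius, and the point value are all assumptions, and the great-circle (haversine) distance stands in for whatever proximity test a real game server would use on GPS fixes.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points (degrees)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def award_points(player_pos, waypoint, radius_m=50.0, points=100):
    """Award game points when the player's GPS fix falls within radius_m
    of the target waypoint; otherwise award nothing."""
    dist = haversine_m(player_pos[0], player_pos[1], waypoint[0], waypoint[1])
    return points if dist <= radius_m else 0
```

The same proximity test could equally trigger the message, clue, or scene delivery described above, with the waypoint list streamed from the game provider's server.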
One method of playing an augmented reality game is depicted in FIG. 28. In this method 2800, a user logs into a website, whereby access to a licensed game is gained. A game is selected. In one example, the user may join a game if multiple players are available and desired; alternatively, the user may create a custom game, perhaps using special roles desired by the user. The game may be scheduled, and in some cases, the players may select a particular time and place for the game, distribute directions to the site(s) where the game will be played, and so forth. Later, the players meet and check into the game, with one or more players using the augmented reality glasses. The participants then play the game and, if applicable, the game results and any statistics (players' scores, game times, etc.) may be stored. Once the game has begun, the locations may change for different players in the game, sending one player to one location and another player or players to a different location. The game may then have different scenarios for each player or group of players, based on the locations provided by each player's or group's GPS or magnetometer. Each player may also be sent different messages or images based on his or her role, his or her location, or both. Of course, each scenario may then lead to other scenarios, other interactions, directions to other locations, and so forth. In one sense, such a game mixes the reality of the players' locations with the game in which the players are participating.
Games can range from simple game types that could be played in the palm of a player's hand, such as small, single-player games. Alternatively, more complicated, multi-player games may also be played. In the former category are games such as SkySiege, AR Drone, and Fire Fighter 360. In addition, multi-player games are also easily envisioned. Since all players must log into the game, a particular game may be played by friends who log in and specify one or more other persons. The locations of the players are also available, via GPS or other methods. Sensors in the augmented reality glasses, or in a game controller as described above, such as accelerometers, gyroscopes, or even a magnetic compass, may also be used for orientation and game play. An example is AR Invaders, available for the iPhone from the App Store. Other games may be obtained from other vendors and for non-iPhone-type systems, such as Layar of Amsterdam, and Parrot SA of Paris, France (the supplier of AR Drone, AR Flying Ace, and AR Pursuit).
In embodiments, games may also be in 3D, such that the user may experience 3D gaming. For example, when playing a 3D game, the user may view a virtual, augmented reality, or other environment in which the user is able to control his viewing perspective. The user may turn his head to view various aspects of the virtual or other environment. As such, when the user turns his head or makes other movements, he may view the game environment as if he were actually in that environment. For example, the perspective of the user may be such that the user is put "into" the 3D game environment with at least some control over the viewing perspective, where the user may move his head and have the view of the game environment change in correspondence with the changed head position. Further, when the user physically walks forward, he may "walk into" the game, with the perspective changing as the user moves. Further, the perspective may also change as the user moves the gaze of his eyes. Additional image information may be provided, such as at the sides of the user's view, that can be accessed by turning the head.
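The head-tracked perspective described above reduces, in its simplest form, to mapping a head orientation to a view direction for the rendered scene. The sketch below is an illustrative assumption, not the patent's method: it ignores roll and display optics and simply converts yaw/pitch angles (as an IMU in the eyepiece might report) into a unit view vector.

```python
import math

def view_direction(yaw_deg, pitch_deg):
    """Map head yaw/pitch (degrees) to a unit view vector.
    Convention assumed here: x = right, y = up, z = forward;
    yaw rotates left/right, pitch tilts up/down."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.sin(yaw),   # x: rightward component
            math.sin(pitch),                   # y: upward component
            math.cos(pitch) * math.cos(yaw))   # z: forward component
```

A renderer would recompute this vector each frame from the latest head-tracker sample, so that turning the head pans the 3D game environment in correspondence with the changed head position.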
In embodiments, the 3D game environment may be projected onto the lenses of the glasses or viewed by other means. Further, the lenses may be opaque or transparent. In embodiments, the 3D game images may be associated with and incorporate the external environment of the user, such that the user may turn his head and the 3D images move together with the external environment. Further, such associations between the 3D game images and the external environment may change, such that the 3D images associate with more than one object in the external environment, or with more than one part of an object, in various instances, so that it appears to the user that the 3D images are interacting with various aspects or objects of the actual environment. As an example, the user may view a 3D game monster climb up a building or onto an automobile, where such building or automobile is an actual object in the user's environment. In such a game, the user may interact with the monster as part of the 3D gaming experience. The actual environment around the user may be part of the 3D gaming experience. In embodiments where the lenses are transparent, the user may interact with the 3D gaming environment while moving about his or her actual environment. The 3D game may incorporate elements of the user's environment into the game, it may be wholly fabricated by the game, or it may be a mixture of both.
In embodiments, the 3D images may be associated with, or generated by, an augmented reality program, 3D game software, and the like, or by other means. In embodiments where augmented reality is employed for the purpose of 3D gaming, the 3D images may appear or be perceived by the user based on the user's location or other data. Such an augmented reality application may provide for the user to interact with the 3D images to provide a 3D gaming environment when using the glasses. As the user changes his location, for example, play in the game may advance, and various 3D elements of the game may become accessible or inaccessible to the viewer. As an example, various 3D enemies of the user's game character may appear in the game based on the actual location of the user. The user may interact with, or cause reactions from, other users playing the game and/or 3D elements associated with the other users. Such elements associated with users may include weapons, messages, currency, a 3D image of the user, and the like. Based on a user's location or other data, he or she may encounter, view, or engage, by any means, other users and 3D elements associated with other users. In embodiments, 3D gaming may also be provided by software installed in or downloaded to the glasses, with or without use of the user's location.
In embodiments, the lenses may be opaque to provide the user with a virtual reality or other virtual 3D gaming experience, where the user is "put into" the game and where the user's movements may change the viewing perspective of the 3D gaming environment for the user. The user may move through or explore the virtual environment, and thereby play the 3D game, through various body, head, and/or eye movements, use of game controllers, one or more touch screens, or any of the control techniques described herein that allow the user to navigate, manipulate, and interact with the 3D environment.
In various embodiments, the user may navigate, interact with, and manipulate the 3D game environment, and experience 3D gaming, via body, hand, finger, eye, or other movements, through the use of one or more wired or wireless controllers, one or more touch screens, any of the control techniques described herein, and the like.
In embodiments, internal and external facilities available to the eyepiece may be used to learn the behavior of the eyepiece user and to store the learned behaviors in a behavioral database, so as to enable location-aware control, activity-aware control, predictive control, and the like. For example, the user may have events and/or a tracking of actions recorded by the eyepiece, such as commands from the user, images sensed through a camera, the GPS location of the user, sensor inputs over time, actions triggered by the user, communications to and from the user, user requests, web activity, music listened to, directions requested, recommendations used or provided, and the like. This behavioral data may be stored in a behavioral database, such as tagged with a user identifier or autonomously. The eyepiece may collect this data in a learn mode, a collection mode, and the like. The eyepiece may utilize past data taken by the user to inform or remind the user of what they have done before, or alternatively, the eyepiece may utilize the data to predict, based on past collected experience, what eyepiece functions and applications the user may need. In this way, the eyepiece may act as an automated assistant to the user, for example, launching applications at the times the user typically launches them, turning off augmented reality and GPS when nearing a location or entering a building, streaming in music when the user enters the gym, and the like. Alternatively, the learned behaviors and/or actions of a plurality of eyepiece users may be autonomously stored in a collective behavior database, where learned behaviors amongst the plurality of users may be made available to individual users based on similar conditions. For example, a user may be visiting a city and waiting for a train on a platform, and the user's eyepiece accesses the collective behavior database to determine what other users have done while waiting for the train, such as getting directions, searching for interesting places, listening to certain music, looking up train schedules, contacting the city website for travel information, connecting to social networking sites for entertainment venues in the area, and the like. In this way, the eyepiece may be able to provide automated assistance to the user, benefiting from many different user experiences. In embodiments, the learned behaviors may be used to develop preference profiles, recommendations, advertisement targeting, social network contacts, behavior profiles for the user or groups of users, and the like.
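A minimal sketch of the behavioral database idea above, under stated assumptions: the class name, the (context, action) event model, and the most-frequent-action prediction rule are all illustrative simplifications of my own, not structures specified by the patent, which leaves the learning mechanism open.

```python
from collections import Counter, defaultdict

class BehaviorDB:
    """Toy behavioral database: log (context, action) events and
    predict the action most frequently taken before in a context."""

    def __init__(self):
        # context -> Counter of actions taken in that context
        self._log = defaultdict(Counter)

    def record(self, context, action):
        """Store one observed event, e.g. ('gym', 'stream_music')."""
        self._log[context][action] += 1

    def predict(self, context):
        """Return the most common past action for this context, or None."""
        counts = self._log.get(context)
        if not counts:
            return None
        return counts.most_common(1)[0][0]
```

A collective version would merge the `_log` tables of many users, letting a user on an unfamiliar train platform inherit the most common actions of others in the same situation.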
In one embodiment, the augmented reality eyepiece or glasses may include one or more acoustic sensors 2900 for detecting sound. An example is depicted in FIG. 29. In one sense, acoustic sensors are similar to microphones in that they detect sounds. Acoustic sensors typically have one or more frequency bandwidths at which they are more sensitive, and the sensors can therefore be chosen for the intended application. Acoustic sensors are available from a variety of manufacturers and are available with appropriate transducers and other required circuitry. Manufacturers include ITT Electronic Systems (Salt Lake City, Utah, USA); Meggitt Sensing Systems (San Juan Capistrano, California, USA); and National Instruments (Austin, Texas, USA). Suitable microphones include those comprising a single microphone as well as those comprising an array of microphones, or a microphone array.
Acoustic sensors may include those using micro-electromechanical systems (MEMS) technology. Because of the very fine structure in a MEMS sensor, the sensor is extremely sensitive and typically has a wide range of sensitivity. MEMS sensors are typically made using semiconductor manufacturing techniques. An element of a typical MEMS accelerometer is a moving beam structure composed of two sets of fingers. One set is fixed to a solid ground plane on a substrate; the other set is attached to a known mass mounted on springs that can move in response to an applied acceleration. This applied acceleration changes the capacitance between the fixed and moving beam fingers. The result is a very sensitive sensor. Such sensors are made, for example, by STMicroelectronics (Austin, Texas) and Honeywell International (Morristown, New Jersey, USA).
In addition to identification, the sound capabilities of the augmented reality devices may also be applied to localizing the origin of a sound. As is well known, at least two sound or acoustic sensors are needed to localize a sound. The acoustic sensors will be equipped with appropriate transducers and signal processing circuitry, such as a digital signal processor, for interpreting the signals and accomplishing the desired goals. One application for sound localizing sensors may be to determine the origin of sounds from within an emergency location, such as a burning building, an automobile accident, and the like. Emergency responders equipped with embodiments described herein may each have one or more acoustic sensors or microphones embedded within the frame. Of course, the sensors could also be worn on the person's clothing or even attached to the person. In any event, the signals are transmitted to the controller of the augmented reality eyepiece. The eyepiece or glasses are equipped with GPS technology and may also be equipped with direction-finding capability; alternatively, with two sensors per person, the microcontroller can determine a direction from which the noise originated.
If there are two or more firefighters, or other emergency responders, their locations are known from their GPS capabilities. Either of the two, or a fire chief, or the control headquarters, then knows the positions of the two responders and the direction from each responder to the detected noise. The exact point of origin of the noise can then be determined using known techniques and algorithms. See, e.g., Acoustic Vector-Sensor Beamforming and Capon Direction Estimation, M. Hawkes and A. Nehorai, IEEE Transactions on Signal Processing, vol. 46, no. 9, September 1998, pp. 2291-2304; see also Cramér-Rao Bounds for Direction Finding by an Acoustic Vector Sensor Under Nonideal Gain-Phase Responses, Noncollocation or Nonorthogonal Orientation, P. K. Tam and K. T. Wong, IEEE Sensors Journal, vol. 9, no. 8, August 2009, pp. 969-982. The techniques used may include timing differences (differences in the time of arrival of the sensed parameter), differences in the speed of sound, and differences in sound pressure. Of course, acoustic sensors typically measure sound pressure levels (e.g., in decibels), and these other parameters may be used with appropriate types of acoustic sensors, including acoustic emission sensors and ultrasonic sensors or transducers.
The appropriate algorithms and all other necessary programming may be stored in the microcontroller of the eyepiece, or in memory accessible to the eyepiece. Using more than one responder, or several responders, a likely location may then be determined, and the responders can attempt to locate the person to be rescued. In other applications, responders may use these acoustic capabilities to determine the location of a person of interest to law enforcement. In still other applications, a number of persons on maneuvers may encounter hostile fire, including direct fire (line of sight) or indirect fire (out of line of sight, including high-angle fire). The same techniques described herein may be used to estimate the location of the hostile fire. If there are several persons in the area, the estimate may be more accurate, especially if the persons are separated to at least some extent over a wider area. This may be an effective tool for directing counter-battery or counter-mortar fire against hostiles. Direct fire may also be used if the target is close enough.
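Once each responder's GPS position and measured bearing toward the noise are known, the simplest fusion is to intersect the two lines of bearing. The sketch below is an illustrative assumption, far simpler than the vector-sensor beamforming cited above: it works in a flat local east/north frame, takes bearings in compass degrees, and ignores measurement error.

```python
import math

def intersect_bearings(p1, brg1_deg, p2, brg2_deg):
    """Estimate a sound source position from two observer positions
    (x east, y north, in meters) and the compass bearing (degrees
    clockwise from north) each observer measured toward the noise.
    Returns (x, y), or None if the bearings are parallel."""
    d1 = (math.sin(math.radians(brg1_deg)), math.cos(math.radians(brg1_deg)))
    d2 = (math.sin(math.radians(brg2_deg)), math.cos(math.radians(brg2_deg)))
    denom = d1[0] * d2[1] - d1[1] * d2[0]  # 2D cross product of directions
    if abs(denom) < 1e-9:
        return None  # parallel lines of bearing: no unique fix
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (dx * d2[1] - dy * d2[0]) / denom  # distance along observer 1's ray
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])
```

With more than two observers, a least-squares fit over all bearing lines would give both a better estimate and a confidence region, which is how the several-soldier case in FIG. 29B improves accuracy.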
An example using an embodiment of the augmented reality eyepiece is depicted in FIG. 29B. In this example 2900B, a number of soldiers are on patrol, each equipped with an augmented reality eyepiece, and are alert for hostile fire. The sounds detected by their acoustic sensors or microphones may be relayed to a squad vehicle as shown, to their platoon leader, or to a remote tactical operations center (TOC) or command post (CP). Alternatively, or in addition, the signals may also be sent to a mobile device, such as the airborne platform shown. Communications among the soldiers and the additional locations may be facilitated using a local area network or other network. In addition, all transmitted signals may be protected by encryption or other protective measures. One or more of the squad vehicle, the platoon leader, the mobile platform, the TOC, or the CP will have an integration capability for combining the inputs from the several soldiers and determining a possible location of the hostile fire. The signals from each soldier will include the soldier's location, from the GPS capability inherent in the augmented reality glasses or eyepiece. The acoustic sensors on each soldier may indicate a possible direction of the noise. Using the signals from several soldiers, the direction and possibly the location of the hostile fire may be determined. The soldiers may then neutralize the location.
In addition to microphones, the augmented reality eyepiece may be equipped with earbuds, which, as noted elsewhere herein, may be articulating earbuds, may be removably attached 1403, or may be equipped with an audio output jack 1401. The eyepiece and earbuds may be equipped to deliver noise-cancelling interference, allowing the user to better hear sounds delivered from the audio-video communications capabilities of the augmented reality eyepiece or glasses, and may feature automatic gain control. The speakers or earbuds of the augmented reality eyepiece may also connect with the full audio and visual capabilities of the device, with the ability to deliver high-quality and clear sound from an included telecommunications device. As noted elsewhere herein, this includes radio or cellular telephone (smart phone) audio capabilities, and may also include complementary technologies, such as Bluetooth™ capabilities for wireless personal area networks (WPAN), or related technologies, such as IEEE 802.11.
Another aspect of the enhanced audio capabilities includes speech recognition and identification. Speech recognition is concerned with understanding what is said, while speech identification is concerned with understanding who the speaker is. Speech identification may work hand in hand with the facial recognition capabilities of these devices to more positively identify persons of interest. As described elsewhere herein, a camera connected as part of the augmented reality eyepiece can unobtrusively focus on desired persons, such as a single person in a crowd or multiple faces in a crowd. Using the camera and appropriate facial recognition software, an image of the person or persons may be taken. The features of the image are then broken down into any number of measurements and statistics, and the results are compared to a database of known persons. An identification may then be made. In the same manner, a voice or voice sample may be taken from the person of interest. The sample may be marked or tagged, e.g., at a particular time interval, and labeled, e.g., with a description of the person's physical characteristics or a number. The voice sample may be compared to a database of known persons, and if the person's voice matches, an identification may be made. In embodiments, multiple individuals of interest may be selected, such as for biometric identification. The multiple selection may be made through the use of a cursor, a hand gesture, eye movement, and the like. As a result of the multiple selection, information about the selected individuals may be provided to the user, such as through a display, by audio, and the like.
In embodiments where the camera is used for biometric identification of multiple people in a crowd, the control techniques described herein may be used to select faces or irises for imaging. For example, cursor selection using a hand-worn control device may be used to select multiple faces in the view of the user's surrounding environment. In another example, gaze tracking may be used to select which faces to choose for biometric identification. In another example, a hand-worn control device may sense a gesture used to select the individuals, such as pointing at each individual.
In one embodiment, the important characteristics of a particular person's speech may be understood from one sample or from multiple samples of the person's voice. The samples are typically broken into segments, frames, and subframes. In general, the important characteristics include the fundamental frequency of the person's voice, energy, formants, speaking rate, and the like. These characteristics are analyzed by software that analyzes the voice according to certain formulae or algorithms. This field is constantly changing and improving. However, such classifiers may currently include algorithms such as neural network classifiers, k-classifiers, hidden Markov models, Gaussian mixture models, pattern matching algorithms, and the like.
A general template 3100 for speech recognition and speaker identification is depicted in FIG. 31. A first step 3101 is to provide a speech signal. Ideally, one has a known sample from a previous encounter with which the signal may be compared. The signal is then digitized in step 3102 and partitioned in step 3103 into fragments, such as segments, frames, and subframes. Features and statistics of the speech sample are then generated and extracted in step 3104. The classifier, or more than one classifier, is then applied in step 3105 to determine the general classification of the sample. Post-processing of the sample may then be applied in step 3106, e.g., to compare the sample to known samples for possible matching and identification. The results may then be output in step 3107. The output may be directed to the person requesting the match, and may also be recorded and sent to one or more other persons and databases.
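Steps 3103-3105 of the template above can be sketched in miniature. This is a deliberately toy illustration of my own, not the patent's classifier: real systems use features such as MFCCs and models such as GMMs or HMMs, whereas the code below frames a digitized signal, extracts only log energy and zero-crossing rate per frame, and matches against stored templates by nearest distance.

```python
import math

def frame_signal(signal, frame_len, hop):
    """Step 3103: partition a digitized signal into overlapping frames."""
    return [signal[i:i + frame_len]
            for i in range(0, len(signal) - frame_len + 1, hop)]

def frame_features(frame):
    """Step 3104: per-frame features -- log energy and zero-crossing rate."""
    energy = sum(x * x for x in frame) / len(frame)
    zcr = sum(1 for a, b in zip(frame, frame[1:]) if a * b < 0) / (len(frame) - 1)
    return (math.log(energy + 1e-12), zcr)

def classify(features, templates):
    """Step 3105: nearest-template classification over averaged features.
    templates is a list of (label, (log_energy, zcr)) reference pairs."""
    avg = [sum(f[i] for f in features) / len(features) for i in range(2)]
    def dist(t):
        return sum((a - b) ** 2 for a, b in zip(avg, t[1]))
    return min(templates, key=dist)[0]
```

The post-processing of step 3106 would then compare the winning template's score against a threshold before declaring a match, rather than always accepting the nearest label.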
In one embodiment, the audio capabilities of the eyepiece include hearing protection with the associated earbuds. The audio processor of the eyepiece may enable automatic noise suppression, such as when a loud noise is detected near the wearer's head. Any of the control techniques described herein may be used with automatic noise suppression.
In one embodiment, the eyepiece may include a nitinol head strap. The head strap may be a thin band of curved metal that may either pull out from, or rotate out of, the arms of the eyepiece and extend behind the head to secure the eyepiece to the head. In one embodiment, the tip of the nitinol strap may have a silicone cover, such that the silicone cover is grasped to pull the strap out from the ends of the arms. In embodiments, only one arm has a nitinol band, and it is secured to the other arm to form a strap. In other embodiments, both arms have a nitinol band and both sides are pulled out either to join and form a strap, or to independently grasp portions of the head to secure the eyepiece on the wearer's head. In embodiments, the eyepiece may have interchangeable devices for attaching the eyepiece to an individual's head, such as connectors for a head strap, eyeglass frames, a helmet strap, a helmet snap connection, and the like. For example, there may be a connector near the user's temple where the eyepiece can attach to a head strap, and where the head strap can be disconnected, such that the user can attach eyeglass frames to give the eyepiece the form of glasses, attach it to a helmet, and the like. In embodiments, the interchangeable device for attaching the eyepiece to the user's head or helmet may include an embedded antenna. For example, the nitinol head strap may have an antenna embedded inside, such as for a particular frequency, for multiple frequencies, and the like. In addition, the frames, head straps, and the like may include an RF-absorbing foam to help absorb RF energy when an antenna is being used to transmit.
With reference to FIG. 21, the eyepiece may include one or more adjustable wrap-around extendable arms 2134. The adjustable wrap-around extendable arms 2134 may secure the position of the eyepiece on the user's head. One or more of the extendable arms 2134 may be made of a shape-memory material. In embodiments, one or both of the arms may be made of nitinol and/or any shape-memory material. In other instances, the ends of at least one of the wrap-around extendable arms 2134 may be covered with silicone. Further, the adjustable wrap-around extendable arms 2134 may extend from the ends of the eyepiece arms 2116. They may extend telescopically and/or they may slide out from the ends of the eyepiece arms. They may slide out from the interior of the eyepiece arms 2116, or they may slide along an exterior surface of the eyepiece arms 2116. Further, the extendable arms 2134 may meet and secure to each other. The extendable arms may also attach to another portion of the head-mounted eyepiece to create a means for securing the eyepiece to the user's head. The wrap-around extendable arms 2134 may meet and secure to each other, interlock, connect, magnetically couple, or secure by other means so as to provide a secure attachment to the user's head. In embodiments, the adjustable wrap-around extendable arms 2134 may also be independently adjustable so as to attach to, or grasp portions of, the user's head. As such, the independently adjustable arms may allow the user increased customizability for a personalized fit in securing the eyepiece to the user's head.
Further, in embodiments, at least one of the wrap-around extendable arms 2134 may be detachable from the head-mounted eyepiece. In yet other embodiments, the wrap-around extendable arms 2134 may be an add-on feature of the head-mounted eyepiece. In such cases, the user may choose to put extendable, non-extendable, or other arms onto the head-mounted eyepiece. For example, the arms may be sold as a kit, or as part of a kit, that allows the user to customize the eyepiece to his or her specific preferences. Accordingly, the user may customize the type of material from which the adjustable wrap-around extendable arms 2134 are made by selecting a different kit with specific extendable arms suited to his preferences. Accordingly, the user may customize his eyepiece for his particular needs and preferences.
In still other embodiments, an adjustable strap 2142 may be attached to the eyepiece arms such that it extends around the rear of the user's head in order to secure the eyepiece in place. The strap may be adjusted to a proper fit. It may be made of any suitable material, including, but not limited to, rubber, silicone, plastic, cotton, and the like.
In an embodiment, the eyepiece may be secured to the user's head by a variety of other structures, such as rigid arms, flexible arms, gooseneck-bendable arms, cable tensioning systems, and the like. For example, a flexible arm may be constructed from flexible tubing, such as in a gooseneck configuration, where the flexible arm may be bent into position to adjust the fit for a given user, and where the flexible arm may be re-formed as needed. In another example, a flexible arm may be constructed from a cable tensioning system, such as in a robotic-finger configuration, where the flexible arm has a plurality of jointed, linked component portions, and each component portion is manipulated into a curved shape by a tensile force applied through a cable passing through each joint and component portion. In this instance, the cable tensioning system may enable articulating ear horns for micro-adjustment, and the eyepiece headwear-retention cable tensioning system may have two or more links, where the cables may be stainless steel, Nitinol-based, electrically actuated, ratcheted, wheel-adjusted, and the like.
Embodiments of a cable tensioning system 17800 are shown in Figures 178 and 179A-B. In embodiments, the cable tensioning system may include an ear horn 17802 comprised of a plurality of jointed, linked component portions, where each component portion is manipulated into a curved shape by tension applied to a cable 17804 passing through each joint and/or component portion. In the mounted position shown in Figure 178, the ear horn may be positioned straight along the user's head. The cable 17804 may be attached through an adjuster 17808, and the adjuster may be positioned so as to increase the tension in the cable 17804, thereby causing the ear horn to bend or curve to conform to the shape of the user's head. By increasing such tension, the ear horn may tighten and/or become more rigid. By conforming to the user's head, the ear horn 17802 may be adjusted to a particular user's head and/or may assist in retaining the eyepiece by holding the eyepiece securely to the user's head. In embodiments, as the tension in the cable 17804 increases, the ear horn becomes more rigid, or less slack, so as to fit to the user's head, and as the tension in the cable 17804 is released, the ear horn becomes more flexible, thereby allowing one or both of the ear horns to extend and/or fold. In embodiments, the adjuster 17808 may be ratcheted, electrically actuated, wheel-adjusted, may include a wedge, and the like. In embodiments, the wedge may be a tapered adjustment member that may be pushed in or pulled out, such as by a pull tab or the like, to provide adjustment that allows one or more portions of the ear horn and/or the eyepiece to be raised or lowered in position. In embodiments, as shown in Figure 179B, the ear horns 17804 may be configured and shaped in a robotic-finger configuration. The adjustable ear horns described herein may secure the eyepiece to the user's head while providing the benefit of a foldable, convenient, and easy-to-use fit. In embodiments, the ear horns may provide a wrap-around-the-head design, where the ear horns on the left and right sides of the eyepiece wrap around the user's head and contact, or nearly contact, the rear of the user's head. In embodiments, the ear horns may latch to one another to provide added security. Such latching may be accomplished by magnets on each ear horn, latching gears on the ear horns, and the like. In embodiments, the ear horns may partially or fully wrap around the user's head or conform to the contour of the user's head, and/or they may be secured to the user's head along the sides of the head and/or behind the user's ears. In embodiments, the ear horns may attach to earphones of the eyepiece, such as the earphones 2104 shown in Figure 22. The ear horns may be attached to the earphones permanently or removably. In embodiments, as shown in Figure 180, an ear horn may include a portion of an earphone of the eyepiece, or it may include the entire earphone (not shown). In embodiments, the adjuster 17808 may be located on the portion of the ear horn closely adjacent to the eyepiece, at the end of the ear horn past the user's ear, or at any other position on the ear horn and/or eyepiece. In embodiments, as described herein, one or both of the ear horns may be adjustable. In embodiments as described herein, and as shown in Figure 184 (which depicts the ear horns alone, without the eyepiece), the ear horns may wrap around the user's head and/or conform to the contour of the user's head.
In embodiments, switchable attractive forces between multiple layers of a laminate may be used for the ear horns. For example, one or more ear horns may include layers in a laminate, and the attractive force between the layers may come from magnetic, electrostatic, and/or vacuum means. In embodiments, magnets may be used in the following manner: the magnetic poles may be rotated into attracting or repelling positions so as to allow the layers of the laminate to attract one another, such that the ear horn stiffens, or to repel one another, such that the ear horn relaxes. In embodiments where the layers of the laminate rest against one another, a voltage may be applied to create an electrostatic attraction that may be switched electrically. As the attractive force is created, the ear horn may stiffen. When the voltage is removed, or the electrostatic attraction is switched off, the ear horn may relax. In embodiments, a vacuum may be created by laminating two layers together, where the two layers are bonded together and have a resilience such that, upon rebound of one or more portions of the layers, a cavity or space is created between the layers, thereby creating a vacuum. As the layers are drawn together, they may make the ear horn rigid. The vacuum seal may be broken to allow the ear horn to relax. In various embodiments, as the ear horns are made rigid, they may provide a more rigid and/or secure retention of the eyepiece to the user's head. In embodiments, the ear horns may partially or fully wrap around the user's head or conform to the contour of the user's head, and/or they may be secured to the user's head and/or the rear of the user's head along the sides of the head and/or behind the user's ears. As the electrostatic potential, the magnetic polarity, and/or the vacuum is adjusted, the ear horns may stiffen to allow the ear horns to be secured to the user's head, and the ear horns may relax or release so as to extend and/or fold into a closed position.
In embodiments, one or more ear horns may include an internal rod and/or cable structure, where each ear horn further includes a magnet. The magnets of the ear horns may be connected to each other, thereby allowing the two ear horns to wrap around the user's head. The act of connecting the magnets to each other may allow the wire and/or internal rod structure to be pulled taut, providing a more consistent fit to the user's head. In embodiments, by connecting the magnets, the internal rods of the ear horns may stand up or become stiffer so as to allow the ear horns to wrap around the user's head, and/or the internal wires may be tightened to allow the ear horns to wrap around the user's head. In embodiments, the ear horns may partially or fully wrap around the user's head or conform to the contour of the user's head, and/or they may be secured to the user's head along the sides of the head and/or behind the user's ears. When the magnets are not connected, the ear horns may extend and/or may be folded.
In embodiments, one or more ear horns may employ air pressure within a chamber inside the ear horn, which may stiffen the ear horn. The air pressure may be increased to stiffen the ear horn. Such stiffening may allow the ear horns to be adjusted and/or to wrap around the user's head while the eyepiece is in use. In embodiments, the ear horns may partially or fully wrap around the user's head or conform to the contour of the user's head, and/or they may be secured to the user's head along the sides of the head and/or behind the user's ears. The air pressure may be lowered to relax the ear horns. When the ear horns are relaxed, they may extend and/or be folded. The air pressure may be adjusted before or after the eyepiece is placed on, or removed from, the user's head. In embodiments, the air pressure may be adjusted by a pump in the side arm, operated by finger pressure or by other means. In embodiments, the pump may be adjusted via a user interface displayed in the glasses, or via other means.
In the embodiments described herein, the stiffness of an ear horn may be related to its thickness by a cubic relationship. As an example, two unconnected layers may have up to twice the stiffness of a single layer; however, if the layers are bonded into a single layer, the combined layer of double thickness will have eight times the stiffness. As a further example, three separate layers have three times the stiffness of a single layer, but three layers bonded together will have 27 times the stiffness of a single layer.
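The 2×/8× and 3×/27× figures above follow from elementary beam theory, where bending stiffness scales with the cube of section thickness. A minimal sketch of that arithmetic (the function name and layer model are illustrative, not part of the disclosure):

```python
def relative_stiffness(n_layers: int, bonded: bool) -> int:
    """Bending stiffness of a stack of n identical layers, relative to one layer.

    Stiffness scales with the cube of thickness (second moment of area
    I ~ t**3 for a rectangular section).  Unbonded layers slide freely and
    merely add their individual stiffnesses: n * 1.  Bonded layers act as
    one beam of thickness n*t: n**3.
    """
    return n_layers ** 3 if bonded else n_layers

# The examples from the text:
assert relative_stiffness(2, bonded=False) == 2    # two free layers: 2x
assert relative_stiffness(2, bonded=True) == 8     # bonded, double thickness: 8x
assert relative_stiffness(3, bonded=False) == 3    # three free layers: 3x
assert relative_stiffness(3, bonded=True) == 27    # bonded, triple thickness: 27x
```

This is why interlocking the laminate portions (rather than letting them slide) is what stiffens the ear horn.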
In embodiments, as shown in Figure 181, one or more ear horns may include inner and outer portions, where the inner portion is formed from one part of the ear horn and the outer portion is formed from another part of the ear horn. The inner and outer portions may be formed from a bifurcation in the ear horn, or may otherwise be formed from the ear horn so as to constitute two separate portions, where one portion is the outer portion and the other is the inner portion. In embodiments, the inner portion may contact the user's head, and the outer portion may contact the inner portion. In embodiments, the inner and outer portions may interlock, such as shown in the embodiment depicted in Figure 182. The inner and outer portions may include interlocking slots, teeth, or other means by which they interlock or bind together. The top and/or outer portion may include a pull tab or other protrusion by which the user may cause the inner and outer portions to no longer lock together. In embodiments, each portion may be bent toward the user's head. Further, the inner surface may push outward against the opposing surface. By interlocking the inner and outer portions, the thickness of the combined portion may be doubled. Accordingly, by increasing the thickness of the ear horn portion, the stiffness may be increased. In embodiments, by doubling the thickness of the ear horn, the stiffness may be increased by a factor of eight as compared to a single layer. Peeling back the outer layer may return the ear horn portion to a flexible state, thereby allowing the ear horn to be folded. In embodiments, the ear horns may attach by magnets, clips, hooks, or otherwise so as to be secured to the user's head.
Further, in embodiments, as depicted in Figure 183, one or more ear horns may include three portions. In such embodiments, the ear horn may include the inner and outer portions as described with reference to Figures 181 and 182; however, the embodiment may also include a middle portion 18302, such that the ear horn is comprised of three distinct portions as shown in Figure 183. The ear horn may further comprise one or more buttons, snaps, interlocking slots, teeth, pins, or other means to lock the portions of the ear horn together. One or more of the portions may include a pull tab or other protrusion by which the user may cause the portions to no longer lock together, such as by releasing the teeth or other means that lock the portions together. In embodiments, three unconnected layers may have three times the stiffness of a single layer, but when the three layers are locked/bonded together, the ear horn may have 27 times the stiffness of a single layer. When the three portions are not connected or locked together, the ear horns may be flexible, such that they may extend and/or fold. Further, while the portions are not locked together, the portions may slide against one another, thereby allowing them to be flexible and more easily stowed when not in use, and when the layers are locked or pinned together, they may not slide against one another. The portions of the ear horn may reside in a sheath, tube, or other structure comprising the ear horn, such that the individual portions are not visible. While ear horns with two and three portions have been described, those skilled in the art will appreciate that, in various embodiments, the ear horns may be comprised of more than three portions and/or of varying thicknesses.
In each embodiment described herein, package ear angle is foldable.(such as work as when ear angle is folded to closed position When user does not use eyepiece), ear angle can it is vertical so that they fold and ear angle package user's head and/or ear or with The ability that the profile of account portion and/or ear mutually agrees with can not interfere folding.In each embodiment being described herein, ear angle It can be folded and thereby vertically, thus allowing ear angle to become flat allows eyepiece to store with flat, configuration.In each embodiment In, when discharging at hinge or discharging otherwise, ear angle can be vertical, so that eyepiece be allowed to be folded.As retouched herein It states, in various embodiments, ear angle can become less rigid, so that them be allowed to fold.
In embodiments, leveling shims may be used with one or more ear horns, enabling them to provide adjustment for ears at different vertical positions, or to otherwise accommodate differing positions of the user's ears or eyes. In embodiments, a shim may be placed at the point where the ear horn contacts the user's ear, so as to adjust the eyepiece to fit differing positions of the user's ears and/or eyes. In embodiments, the leveling shims may be adjusted by wedges or by various other means. The leveling shim may be part of the ear horn, or the leveling shim may be attached to the ear horn by a clip, adhesive, friction, or other means.
In the embodiments described herein, the eyepiece and ear horns may be fitted with closed-cell foam on one or more areas that contact the user. The foam may provide comfort to the user while also preventing moisture and sweat from penetrating the foam. Further, the closed-cell foam provides a non-porous surface, thereby preventing the eyepiece from carrying bacteria, microbes, and other organisms, and preventing their growth. In the embodiments described herein, the foam may be antimicrobial and/or antibacterial, and/or may be treated with substances for such purposes.
In an embodiment, the eyepiece may include security features, such as M-Shield Security, secure content, DSM, secure runtime, IPsec, and the like. Other software features may include: user interface, applications, framework, BSP, codecs, integration, testing, system validation, and the like.
In an embodiment, the eyepiece materials may be chosen to enable ruggedization.
In an embodiment, the eyepiece may include a 3G radio, or may be able to access a 3G access point, an 802.11b connection, and a Bluetooth connection, so as to enable data to hop from a device to the 3G-enabled eyepiece embodiment.
The present disclosure also relates to methods and apparatus for capturing biometric data about individuals. The methods and apparatus provide wireless capture of fingerprints, iris patterns, facial structure, and other unique biometric features of individuals, and then send the data to a network or directly to the eyepiece. The data gathered from an individual may also be compared to previously gathered data and used to identify a particular individual.
In embodiments, the eyepiece 100 may be associated with a mobile biometric device, such as a bio-print flashlight 7300, a bio-phone 8000, a bio-print camera, a pocket bio-kit 5400, an armband bio-kit 5600, or the like, where the mobile biometric device may serve as a standalone device or may be in communication with the eyepiece, such as for control of the device, for display of data from the device, for storage of data, for linking to external systems, for linking to other eyepieces and/or other mobile biometric devices, and the like. The mobile biometric device may enable soldiers, or other non-military personnel, to collect or utilize existing biometric data in order to profile a certain individual. The device may provide tracking, monitoring, and collection of biometric records, such as including video, voice, gait, face, and iris biometric features, and the like. The device may provide geo-location tagging of the collected data, such as with time, date, location, data-collecting person, environment, and the like. Such as by utilizing thin-film sensors to record, collect, identify, and verify faces, fingerprints, irises, latent fingerprints, latent palm prints, voices, pocket litter, and other identifying marks and environmental data, the device may be able to capture and record fingerprints, palm prints, scars, marks, tattoos, audio, video, annotations, and the like. The device may be able to read wet or dry prints. The device may include a camera, such as with IR illumination, UV illumination, and the like, providing the ability to see through dust, smoke, haze, and the like. The camera may support extended dynamic range, adaptive defect pixel correction, advanced sharpness enhancement, geometric distortion correction, advanced color management, hardware-based face detection, video stabilization, and the like. In embodiments, the camera output may be transmitted to the eyepiece for presentation to the soldier. Depending on requirements, the device may accommodate a plurality of other sensors, such as described herein, including an accelerometer, a compass, an ambient light sensor, a proximity sensor, a barometric pressure sensor, a temperature sensor, and the like. The device may also have a mosaic print sensor as described herein, thereby producing high-resolution images of the whorls and ridge flows of an individual's fingerprint, of multiple fingerprints simultaneously, of a palm print, and the like. Soldiers may use the mobile biometric device to more easily collect personal information, such as for Document and Media Exploitation (DOMEX). For example, during interviews, enrollment, interrogation, and the like, an operator may photograph and read identification data or "pocket litter" (e.g. passports, ID cards, personal documents, cell phone directories, photographs), collect biometric data, and the like, entering the data into a searchable, secure database of profiles of persons of interest. In embodiments, biometric identification data may be filed using the most specific image plus manual entry, so as to achieve complete data capture. The data may be automatically geo-located, time/date stamped, and filed into a digital dossier, such as with a locally or network-assigned globally unique identifier (GUID). For example, a facial image may be captured at an IED (improvised explosive device) blast site, a left iris image may be captured at a suicide-bombing site, and a latent fingerprint may be lifted from a sniper rifle, each collected at a different place and time with a different mobile biometric device, and together these multiple inputs may identify a person of interest, such as at a vehicle checkpoint.
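The dossier entry described above (geo-located, time/date stamped, GUID-keyed) can be sketched as a simple record builder. The function name, field names, and sample values are illustrative assumptions, not part of the disclosure:

```python
import datetime
import uuid


def make_biometric_record(modality, payload_ref, lat, lon, place, collector):
    """Assemble one geo-located, time/date-stamped dossier entry with a GUID.

    Records captured at different places and times (face at a blast site,
    iris at a checkpoint, latent print from a rifle) can later be joined
    against the same person of interest.
    """
    return {
        "guid": str(uuid.uuid4()),  # locally assigned globally unique identifier
        "modality": modality,       # e.g. "face", "iris-left", "latent-fingerprint"
        "payload_ref": payload_ref, # pointer to the captured image/audio file
        "geo": {"lat": lat, "lon": lon, "place": place},
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "collector": collector,
    }


rec = make_biometric_record("iris-left", "capture_0042.png",
                            33.312, 44.361, "vehicle checkpoint", "operator-12")
assert rec["modality"] == "iris-left"
assert len(rec["guid"]) == 36  # canonical UUID string form
```

A network-assigned GUID would replace the local `uuid.uuid4()` call with an identifier issued by the backing database.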
Further embodiments of the eyepiece may be used to provide biometric data collection and result reporting. Biometric data may be visual biometric data, such as facial biometric data or iris biometric data, or may be audio biometric data. Figure 39 depicts an embodiment providing biometric data capture. The assembly 3900 incorporates the eyepiece 100 discussed above with reference to Figure 1. Eyepiece 100 provides an interactive head-mounted eyepiece that includes an optical assembly. Other eyepieces providing similar functionality may also be used. Eyepieces may also incorporate global positioning system capability to permit location information display and reporting.
The optical assembly allows the user to view the surrounding environment, including individuals in the vicinity of the wearer. An embodiment of the eyepiece allows the user to biometrically identify nearby individuals using facial images and iris images, or both facial and iris images, or audio samples. The eyepiece incorporates a corrective element that corrects the user's view of the surrounding environment, and also displays content provided to the user through an integrated processor and image source. The integrated image source introduces the content to be displayed to the user into the optical assembly.
The eyepiece also includes an optical sensor for capturing biometric data. The integrated optical sensor, in an embodiment, may incorporate a camera mounted on the eyepiece. This camera is used to capture biometric images of an individual near the user of the eyepiece. The user directs the optical sensor or the camera toward a nearby individual by positioning the eyepiece in the appropriate direction, which may be done simply by looking at the individual. The user may select whether to capture one or more of a facial image, an iris image, or an audio sample.
The biometric data that may be captured by the eyepiece illustrated in Figure 39 includes facial images for facial recognition, iris images for iris recognition, and audio samples for voice identification. The eyepiece 3900 incorporates multiple microphones 3902 in an endfire array disposed along both the right and left temples of the eyepiece. The microphone arrays 3902 are specifically tuned to enable capture of human voices in an environment with high levels of ambient noise. The microphones may be directional, steerable, and switchable. Microphones 3902 provide selectable options for improved audio capture, including omnidirectional operation or directional beam operation. Directional beam operation allows the user to record audio samples from a specific individual by steering the microphone array in the direction of the subject individual. An adaptive microphone array may be created that allows the operator to steer the directionality of the microphone array in three dimensions, where the directional beam may be adjusted in real time to maximize signal from, or minimize interfering noise for, a non-stationary target. Array processing allows summing of cardioid elements by analog or digital means, where there may be switching between omnidirectional and directional array operation. In embodiments, beamforming, array steering, adaptive array processing (speech source location), and the like may be performed by the on-board processor. In an embodiment, the microphones may be capable of 10 dB directional recording.
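The "directional beam" above is classically produced by delay-and-sum beamforming: each microphone channel is delayed so that a wavefront from the look direction lines up, then the channels are summed so that on-axis sound adds coherently while off-axis sound partially cancels. A minimal integer-sample sketch under assumed geometry (the function and signal values are illustrative, not the patent's implementation):

```python
def delay_and_sum(channels, delays):
    """Steer an array by delaying each channel, then summing.

    channels: list of equal-length sample lists, one per microphone.
    delays:   per-channel integer sample delays chosen so that a wavefront
              from the look direction arrives time-aligned after shifting.
    """
    n = len(channels[0])
    out = [0.0] * n
    for ch, d in zip(channels, delays):
        for i in range(n):
            j = i - d  # shift this channel forward by d samples
            if 0 <= j < n:
                out[i] += ch[j]
    return [v / len(channels) for v in out]


# A pulse hits mic 0 at sample 5 and mic 1 at sample 7 (endfire geometry):
m0 = [0.0] * 16
m0[5] = 1.0
m1 = [0.0] * 16
m1[7] = 1.0
aligned = delay_and_sum([m0, m1], delays=[2, 0])
assert aligned[7] == 1.0  # coherent sum in the look direction
```

Real-time steering amounts to recomputing the delay vector for a new look direction; the disclosure's adaptive variant would update those delays continuously as the target moves.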
Audio biometric capture is enhanced by incorporating phased-array audio and video tracking for audio and video capture. The audio tracking allows for continued capture of an audio sample when the target individual is moving in an environment with other noise sources. In embodiments, the user's own voice may be subtracted from the track in order to enable a clearer reproduction of the target individual, such as for distinguishing what has been said, providing better location tracking, providing better audio tracking, and the like.
To provide power for the display optics and biometric data collection, the eyepiece 3900 also incorporates a lithium-ion battery 3904, capable of operating for over twelve hours on a single charge. In addition, the eyepiece 100 also incorporates a processor and solid-state memory 3906 for processing the captured biometric data. The processor and memory are configurable to function with any software or algorithm used as part of a biometric capture protocol or format, such as the .wav format.
A further embodiment of the eyepiece assembly 3900 provides an integrated communications facility that transmits the captured biometric data to a remote facility that stores the biometric data in a biometric data database. The biometric data database interprets the captured biometric data and prepares content for display on the eyepiece.
In operation, a wearer of the eyepiece desiring to capture biometric data from a nearby observed individual positions himself or herself so that the individual appears in the field of view of the eyepiece. Once in position, the user initiates capture of biometric information. Biometric information that may be captured includes iris images, facial images, and audio data.
In operation, a wearer of the eyepiece desiring to capture audio biometric data from a nearby observed individual positions himself or herself so that the individual appears near the eyepiece, specifically, near the microphone arrays located in the eyepiece temples. Once in position, the user initiates capture of audio biometric information. This audio biometric information consists of a recorded sample of the target individual speaking. Audio samples may be captured in conjunction with visual biometric data, such as iris and facial images.
To capture an iris image, the wearer/user observes the desired individual and positions the eyepiece such that the optical sensor assembly or camera may collect an image of the biometric parameters of the desired individual. Once captured, the eyepiece processor and solid-state memory prepare the captured image for transmission to a remote computing facility for further processing.
The remote computing facility receives the transmitted biometric image and compares the transmitted image to previously captured biometric data of the same type. Iris or facial images are compared with previously collected iris or facial images to determine whether the individual has been previously encountered and identified.
Once the comparison has been made, the remote computing facility transmits a report of the comparison to the wearer/user's eyepiece, for display. The report may indicate that the captured biometric image matches previously captured images. In such cases, the user receives a report including the identity of the individual, along with other identifying information or statistics. Not all captured biometric data allows for an unambiguous determination of identity. In such cases, the remote computing facility provides a report of findings and may request the user to collect additional biometric data, possibly of a different type, to aid in the identification and comparison process. Visual biometric data may be supplemented with audio biometric data as a further aid to identification.
Facial images are captured in a manner similar to iris images. The field of view is necessarily larger, due to the size of the images collected. This also permits the user to stand further from the subject whose facial biometric data is being captured.
In operation, the user may have originally captured a facial image of the individual. However, the facial image may be incomplete or inconclusive because the individual may be wearing clothing or other apparel, such as a hat, that obscures facial features. In such a case, the remote computing facility may request that a different type of biometric capture be used, and that additional images or data be transmitted. In the case described above, the user may be directed to obtain an iris image to supplement the captured facial image. In other instances, the additional requested data may be an audio sample of the individual's voice.
Figure 40 illustrates capturing an iris image for iris recognition. The figure illustrates the focus parameters used to analyze the image, and includes a geographical location of the individual at the time of biometric data capture. Figure 40 also depicts a sample report that is displayed on the eyepiece.
Figure 41 illustrates capture of multiple types of biometric data, in this instance facial and iris images. The capture may be done simultaneously, or upon request of the remote computing facility when a first type of biometric data leads to an inconclusive result.
Figure 42 shows the electrical configuration of the multiple microphone arrays contained in the temples of the eyepiece of Figure 39. The endfire microphone arrays allow for greater discrimination of signals and better directionality at a greater distance. Signal processing is improved by incorporating a delay into the transmission line of the rear microphone. The use of dual omnidirectional microphones enables switching from an omnidirectional microphone to a directional microphone. This allows for better direction finding for audio capture of a desired individual. Figure 43 illustrates the directionality improvements available with different microphones.
As shown in the top portion of Figure 43, a single omnidirectional microphone may be used. The microphone may be placed at a given distance from the sound source, and the sound pressure or directivity index (DI) at the microphone will be at a given dB level. Instead of a single microphone, multiple microphones or microphone arrays may be used. For example, two microphones may be placed at twice the distance from the source, a distance factor of 2, with a 6 dB increase in sound pressure. Alternatively, four microphones may be used, at a distance factor of 2.7, with an 8.8 dB increase. Arrays may also be used. For example, an 8-microphone array at a distance factor of 4 may have a 12 dB DI increase, and a 12-microphone array at a distance factor of 5 may have a 13.2 dB DI increase. The graph of Figure 43 depicts points from which a given sound pressure level produces the same signal level at the microphone. As shown in Figure 43, first- and second-order hypercardioid microphones may be used at the same distance, in this example with a 6.2 dB increase. Multiple microphones may be arranged in a composite microphone array. Instead of using one standard high-quality microphone to capture an audio sample, the eyepiece temple pieces house multiple microphones of different character. For example, this provides for capture when a user is creating a biometric fingerprint of someone's voice, and for future comparison. One example of multiple microphone use employs microphones from cellular phones to reproduce the exact electrical and acoustic properties of an individual's voice. This sample is stored in a database for future comparison. If the individual's voice is later captured, the earlier sample is available for comparison, and it will be reported to the eyepiece user, as the acoustic properties of the two samples will match.
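The "distance factor" pairs quoted above (6 dB → 2, 12 dB → 4) are consistent with free-field spreading, where level falls 20·log10(d) with distance, so a directivity-index gain of G dB extends the working distance by 10^(G/20). A small sketch of that conversion (the relation is standard acoustics; the function name is ours):

```python
import math


def distance_factor(di_gain_db: float) -> float:
    """Reach multiplier for equal pickup level, given a directivity-index gain.

    Free-field sound pressure level falls by 20*log10(d) with distance d,
    so a DI gain of G dB lets the array capture the same level at
    10**(G/20) times the distance.
    """
    return 10 ** (di_gain_db / 20.0)


assert round(distance_factor(6.0), 1) == 2.0    # 2 mics, +6 dB: factor ~2
assert round(distance_factor(12.0), 1) == 4.0   # 8-mic array, +12 dB: factor ~4
# The figure's 8.8 dB case computes to ~2.75, close to the quoted 2.7:
assert math.isclose(distance_factor(8.8), 2.75, rel_tol=0.01)
```

The small residuals against the figure's quoted factors (2.7 vs. 2.75, 5 vs. ~5.0 for 13.98 dB) suggest the plotted values were read off a chart rather than computed exactly.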
Figure 44 shows the use of adaptive arrays to improve audio data capture. By modifying pre-existing algorithms for audio processing, adaptive arrays may be created that allow the user to steer the directionality of the antenna in three dimensions. Adaptive array processing permits location of the source of speech, thus tying the captured audio data to a specific individual. Array processing permits the simple summing of the cardioid elements of the signal to be done either digitally or using analog techniques. In normal use, the user may switch the microphones between omnidirectional mode and a directional array. The processor allows beamforming, array steering, and adaptive array processing to be performed on the eyepiece. In embodiments, the audio phased array may be used for audio tracking of a particular individual. For example, the user may lock onto the audio signature of a certain individual in the surrounding environment (such as obtained in real time or from a database of voice signatures) in order to track the location of that individual, without the user maintaining eye contact or moving their head. The location of the individual may be projected to the user through the eyepiece display. In embodiments, tracking of the certain individual may also be provided through an embedded camera in the eyepiece, where the user is not required to maintain eye contact with the individual or move their head to follow. That is, in either the audio or visual tracking case, the eyepiece may be able to track the individual in the local environment without the user needing to exhibit physical motions indicative of the ongoing tracking, even as the user moves their view direction.
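Locating a speech source with a microphone pair typically starts from the time difference of arrival (TDOA): the lag that maximizes the cross-correlation between two channels gives the relative delay, from which a bearing follows via arcsin(c·τ/d) for mic spacing d and sound speed c. A brute-force integer-lag sketch (illustrative only; a real system would use GCC-PHAT on windowed spectra):

```python
def tdoa_samples(a, b, max_lag):
    """Estimate the delay (in samples) of signal b relative to signal a by
    finding the lag that maximizes their cross-correlation."""
    best_lag, best_val = 0, float("-inf")
    n = len(a)
    for lag in range(-max_lag, max_lag + 1):
        s = 0.0
        for i in range(n):
            j = i + lag
            if 0 <= j < n:
                s += a[i] * b[j]  # correlate a against b shifted by lag
        if s > best_val:
            best_val, best_lag = s, lag
    return best_lag


# The same transient reaches the rear mic 3 samples after the front mic:
front = [0, 0, 1, 2, 1, 0, 0, 0, 0, 0]
rear = [0, 0, 0, 0, 0, 1, 2, 1, 0, 0]
assert tdoa_samples(front, rear, max_lag=5) == 3
```

Repeating the estimate across several mic pairs yields the three-dimensional steering direction the adaptive array then locks onto.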
In one embodiment, the integrated camera may continuously record a video file, and the integrated microphone may continuously record an audio file. The integrated processor of the eyepiece may enable event tags to be added within a long section of continuous audio or video recording. For example, a full day of passive recording may be tagged whenever an event, conversation, encounter, or other item of interest occurs. Tagging may be accomplished by an explicit button press, a noise or physical tap, a gesture, or any other control technique described herein. A marker may be placed in the audio or video file or stored in a metadata header. In embodiments, the marker may include the GPS coordinates of the event, conversation, encounter, or other item of interest. In other embodiments, the marker may be time-synchronized with a GPS log of the day. Other logic-based triggers may also tag the audio or video file, such as proximity relationships to other users, devices, locations, and the like. Event tags may be active event tags triggered manually by the user, passive event tags that occur automatically (such as through pre-programming or through an event profile management facility), location-sensitive tags triggered by the user's location, and so on. The event that triggers an event tag may be triggered by a sound, a sight, a visual marker, reception from a network connection, an optical trigger, an audio trigger, a proximity trigger, a time trigger, a geospatial trigger, and the like. The event trigger may generate feedback to the user (such as an audio tone, a visual indicator, or a message), store information (such as a file, document, list entry, audio file, or video file), generate a transmission of information, and the like.
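A tag carrying a trigger type, optional GPS coordinates, and a timestamp, as described above, might be modeled as a small record attached to an offset in the continuous recording. All field and function names below are illustrative assumptions, not from the patent:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class EventTag:
    """One tag dropped into a continuous audio/video recording."""
    offset_s: float               # position within the recording
    trigger: str                  # "manual", "proximity", "geo", ...
    lat: Optional[float] = None   # optional GPS coordinates
    lon: Optional[float] = None
    stamped_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

def tag_recording(tags: List[EventTag], offset_s: float, trigger: str,
                  lat: Optional[float] = None,
                  lon: Optional[float] = None) -> EventTag:
    """Append a new tag; in a real system this list would live in the
    file's metadata header."""
    tag = EventTag(offset_s, trigger, lat, lon)
    tags.append(tag)
    return tag

tags: List[EventTag] = []
tag_recording(tags, 3612.4, "manual")               # user taps a button
tag_recording(tags, 5120.0, "geo", 34.05, -118.25)  # location-triggered
print(len(tags), tags[1].trigger)
```

Keeping tags as sidecar metadata (rather than rewriting the media stream) matches the idea of marking a long passive recording after the fact.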
In one embodiment, the eyepiece may be used as SigInt (signals intelligence) glasses. Using one or more of the integrated WiFi, 3G, or Bluetooth radios, the eyepiece can be used to unobtrusively and passively collect signals intelligence on devices and individuals near the user. Signals intelligence may be collected automatically, or may be triggered when a particular device ID is within range, when a particular audio sample is detected, when a particular geographic location is reached, and so on.
Various embodiments of the tactical glasses may include separately identifying or collecting biometric information, geo-locating a person of interest (POI) at a safe distance using visual biometric information (face, iris, walking gait), and positively identifying the POI using robust sparse-recognition algorithms for faces and irises. The glasses may include a hands-free display serving as a biometric computer interface, fusing latent-print and visual biometric information on an integrated display (with augmented highlighting of targets), and checking for matches and alerting the wearer without alerting the POI. The glasses may include location awareness, such as displaying current and average speed plus the route and ETA (estimated time of arrival) to a destination, and pre-loading or recording danger points and exfiltration routes. The glasses may include real-time networked tracking of blue and red forces, so that you always know where your friendly forces are, visual separation ranges between blue and red forces are realized, and enemy positions are geo-located and shared in real time. A processor associated with the glasses may include OCR conversion and voice conversion capabilities.
The tactical glasses can be used in combat to provide a graphical user interface projected on the lens, presenting the user with directional, augmented-reality data on such things as: team-member position data, map information for the soldier's area, SWIR/CMOS night vision, vehicle S/A, a geo-locating laser rangefinder typically accurate to within 2 meters for POIs or targets at roughly 500 meters, S/A blue-force range rings, geo-located Domex registration, AR battlefield-repair overlays, and real-time UAV video. In one embodiment, the laser rangefinder may be a 1.55-micron eye-safe laser rangefinder.
The eyepiece may use GPS as described herein and inertial navigation (such as with an inertial measurement unit), all such as those described herein, to provide position and direction accuracy. However, the eyepiece may enhance position and direction accuracy with additional sensors and associated algorithms, such as a three-axis digital compass, inclinometer, accelerometer, gyroscope, and the like. For example, military operations may require greater position accuracy than is available from GPS alone, so GPS may be used in combination with other navigation sensors for increased position accuracy.
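One common way to combine GPS with inertial sensors for improved position accuracy, as discussed above, is a complementary filter. The patent does not specify a fusion algorithm, and the drift and blend values below are invented for illustration; this 1-D toy simply shows the GPS term bounding IMU dead-reckoning drift:

```python
def fuse(gps_pos: float, imu_pos: float, alpha: float = 0.8) -> float:
    """One step of a 1-D complementary filter: trust the smooth
    dead-reckoned IMU estimate short-term (alpha near 1), while the
    GPS term keeps long-term drift bounded."""
    return alpha * imu_pos + (1.0 - alpha) * gps_pos

# IMU dead reckoning drifts 0.5 m per step; GPS fixes are noisy but
# unbiased around the true position of 100 m.
est = 100.0
for gps in (99.0, 101.5, 100.2, 98.8, 100.9):
    est = fuse(gps, est + 0.5)
print(round(est, 2))
```

Without the GPS term the estimate would drift without bound; with it, the error settles to a fixed offset proportional to the drift rate, which is why adding even a noisy absolute reference improves accuracy.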
The tactical glasses may feature enhanced resolution, such as 1280 × 1024 pixels, and may feature auto-focus.
In dismounted missions to capture enemy combatants, the sine qua non of winning low-intensity, low-density, asymmetric warfare is effective information management. The tactical glasses system delivers this through uncooperative data recording and an intuitive tactical display of a fused situational-awareness picture, incorporating the ES2 (every soldier is a sensor) capability.
In embodiments, the tactical glasses may include one or more waveguides integrated into the frame. In certain embodiments, a total-internal-reflection lens is affixed to a pair of ballistic glasses in a monocular or binocular flip-up/flip-down configuration. The tactical glasses may include omni-directional earbuds for enhanced hearing and protection, and a noise-cancelling boom microphone for conveying differentiated voice commands.
In certain embodiments, the waveguide may have contrast control. The contrast may be controlled using any control technique described herein, such as gesture control, automated sensor control, or manual control using a temple-mounted controller.
The tactical glasses may include a non-slip, adjustable elastic head strap. The tactical glasses may include clip-in corrective lenses.
In certain embodiments, the total-internal-reflection lens is affixed to a helmet-mounted device, as in Figure 74, and may include a day/night VIS/NIR/SWIR CMOS color camera. The device utilizes a "see-through", flip-up, electro-optically projected image display, allowing an unobstructed "field of view" of both the threat and the soldier's own weapon. The helmet-mounted device shown in Figure 74A may include an IR/SWIR illuminator 7402, a UV/SWIR illuminator 7404, a visible-to-SWIR wide-angle lens 7408, a visible-to-SWIR objective lens (not shown), a transparent viewing pane 7410, an iris-recognition objective lens 7412, a laser emitter 7414, a laser receiver 7418, or any other sensor, processor, or technology described herein with respect to the eyepiece, such as an integrated IMU, an eye-safe laser rangefinder, an integrated GPS receiver, a compass and inclinometer for positional accuracy, perspective control that changes the viewing angle of the image to match eye position, electronic image stabilization and real-time enhancement, a threat library stored on board or stored remotely and accessed through the tactical network, and the like. A wireless computer worn on the body can interconnect with the device of Figure 74. The helmet-mounted device includes visible-to-SWIR projector optics, such as RGB micro-projector optics. Multispectral IR and UV imaging helps to recognize counterfeit or altered documents. The helmet-mounted device can be controlled with an encrypted wireless UWB wristband or a weapon fore-grip controller.
In one embodiment, the transparent viewing pane 7410 can rotate through 180° to project the image onto a surface and share it with others.
Figure 74B shows an exploded side view of the helmet-mounted device. The device may include an ambidextrous base for mounting on either the left or right side of the helmet. In certain embodiments, two devices may be mounted, one on each side of the helmet, to allow binocular vision. The device, or both devices, can snap into standard MICH or PRO-TECH helmet mounts.
Today, soldiers cannot effectively exploit data devices on the battlefield. The tactical glasses system combines a low-profile form factor, lightweight materials, and fast processors to enable quick, accurate decision-making in battle. The system's modular design allows the equipment to be efficiently deployed to an individual, squad, or company, while retaining interoperability with any battlefield computer. The tactical glasses system incorporates real-time data communication. Using the onboard computer interface, operators can view, upload, or compare data in real time. This provides valuable situational and environmental data that can be promptly broadcast to all networked individuals as well as to the command post (CP) and tactical operations center (TOC).
Figures 75A and 75B depict an exemplary embodiment of biometric and situational awareness glasses in front view and side view, respectively. The embodiment may include: multiple field-of-view sensors 7502 for biometric collection, situational awareness, and an enhanced-view user interface; a quick-lock GPS receiver and IMU (including a 3-axis digital compass, gyroscope, accelerometer, and inclinometer for position and direction accuracy); a 1.55-micron eye-safe laser rangefinder 7504 to aid biometric capture and aiming; an integrated digital video recorder with storage on two flash SD cards; real-time electronic image stabilization and real-time image enhancement; a threat library stored on a carried micro SD card or loaded remotely over the tactical network; flip-up photochromic lenses 7508; a flexible noise-cancelling boom microphone 7510; and 3-axis removable stereo earbuds with an enhanced hearing and protection system. For example, the multiple field-of-view sensors 7502 allow a 100° × 40° FOV, which may be a panoramic SXGA view. For example, the sensors may be VGA sensors, SXGA sensors, and VGA sensors that generate a stitched 100° × 40° FOV panoramic SXGA view on the display of the glasses. The display may be translucent and have perspective control, the perspective control changing the viewing angle of the image to match eye position. The embodiment may also include SWIR detection so that the wearer can see the enemy's otherwise invisible 1064 nm and 1550 nm laser designators, and may feature: ultra-low-power 256-bit AES-encrypted connections among glasses, tactical radios, and computers; instant 2× magnification; automatic face tracking; face and iris recording and recognition with a 1-meter automatic identification range; and GPS geo-location. The embodiment may include a power supply, such as a 24-hour-duration 4-AA alkaline, lithium, or rechargeable battery box, with waterproof and dustproof seals on the computer and memory expansion slots. In one embodiment, the glasses include a curved holographic waveguide.
In embodiments, the eyepiece may be able to sense lasers such as those used for battlefield targeting. For example, sensors in the eyepiece may be able to detect lasers in typical military laser transmission bands (e.g., 1064 nm, 1550 nm). In this way, the eyepiece may be able to detect whether the wearer's position is being targeted, whether another location is being targeted, the position of a spotter using a laser as an aiming aid, and the like. Further, because the eyepiece may sense lasers either directly or by reflection, a soldier can not only detect enemy laser sources directed or reflected at his position, but can also provide his own laser source to locate optical surfaces (such as eyes) in a battlefield scene. For example, a soldier can scan the battlefield with a laser and use the eyepiece to watch for laser reflections returned, such as from the eyes of an observing enemy, indicating a possible enemy position. In embodiments, the eyepiece can continuously scan the surrounding environment for lasers and provide feedback and/or action based on the detection results, such as an audible alarm to the soldier, a visual indicator on the eyepiece display indicating the position, and the like.
In certain embodiments, a compact camera (Pocket Camera) can record video and capture pictures, allowing the operator to record environmental data with a mobile, lightweight, rugged biometric device sized to be stored in a pocket, for later analysis. One embodiment may measure 2.25" × 3.5" × 0.375", and can perform face capture at 10 feet and iris capture at 3 feet, recording voice, pocket litter, walking gait, and other identifying marks and environmental data in an EFTS- and EBTS-compliant format compatible with any iris/facial algorithm. The device is designed for pre-vetted capture of EFTS/EBTS/NIST/ISO/ITL1-2007-compliant images, to be matched and filed by any biometric matching software or user interface. The device may include an HD video chip, a 1 GHz processor with a 533 MHz DSP, a GPS chip, active illumination, and pre-qualification algorithms. In certain embodiments, the compact biometric camera (Pocket Bio Cam) can operate without connection to a biometric watchlist, so it can be used at all echelons and/or for police reserve operations. Data can be automatically geo-located and date/time stamped. In certain embodiments, the device can run the Linux SE operating system, meet MIL-STD-810 (Military Standard 810) environmental standards, and be waterproof to a depth of 3 feet (about 1 meter).
In one embodiment, the device for fingerprint collection may be referred to as a bio-print device. The bio-print device includes a transparent platen having two beveled edges. The platen is illuminated by a bank of LEDs and is viewed by one or more cameras. Multiple cameras are used, arranged closely together and pointed at the beveled edge of the platen. A finger or palm is placed on, and pressed against, the upper surface of the platen, and the cameras capture the ridge pattern. The image is recorded using frustrated total internal reflection (FTIR). In FTIR, light escapes the platen through the air gaps formed by the ridges and valleys of the finger or palm pressed against the platen.
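The FTIR condition above follows from Snell's law: light inside the platen is totally internally reflected only beyond the critical angle, and where a higher-index skin ridge touches the surface that angle rises, so light escapes there. A small worked example with assumed refractive indices (glass ≈ 1.5, skin ≈ 1.4 — illustrative values, not from the patent):

```python
import math

def critical_angle_deg(n_platen: float, n_outside: float) -> float:
    """Incidence angle (from the surface normal) beyond which light
    inside the platen is totally internally reflected."""
    return math.degrees(math.asin(n_outside / n_platen))

glass_air = critical_angle_deg(1.5, 1.0)    # valleys: air gap, TIR holds
glass_skin = critical_angle_deg(1.5, 1.4)   # ridges: skin contact
print(f"glass/air: {glass_air:.1f} deg, glass/skin: {glass_skin:.1f} deg")
# Rays striking between these two angles reflect where an air gap
# (a valley) sits, but escape where a ridge touches -- producing the
# ridge/valley contrast the cameras record.
```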
Other embodiments are also possible. In one embodiment, multiple cameras are placed in inverted "V" shapes in a sawtooth pattern. In another embodiment, a rectangle is formed and light passed directly through its side, with a camera array capturing the resulting images. The light enters the rectangle through its side, and the cameras are positioned beneath the rectangle, so that the cameras can capture the ridges and valleys illuminated by the light passing through the rectangle.
After the images are captured, software is used to stitch the images from the multiple cameras together. A custom FPGA may be used for the digital image processing.
Once captured and processed, the images can be streamed to a remote display, such as a smart phone, computer, handheld device, eyepiece, or other device.
The above description provides an overview of the operation of the disclosed methods and devices. Additional description and discussion of these and other embodiments is provided below.
Figure 45 illustrates the construction and layout of an optics-based fingerprint and palm-print system according to an embodiment. The optical array consists of approximately 60 wafer-level cameras 4502. The optics-based system uses continuous ambient lighting 4503, 4504 for high-resolution imaging of the whorls and ridge-flow lines that make up a fingerprint or palm print. This configuration provides a low-profile, lightweight, extremely rugged construction. Durability is enhanced by a scratch-proof, transparent platen.
The mosaic print sensor uses a frustrated total internal reflection (FTIR) optical faceplate to supply images to an array of wafer-level cameras mounted on a PCB-like substrate 4505. The sensor can be scaled to any flat width and length, with a depth of approximately 1/2". Sizes can range from a plate small enough to capture just one rolled fingerprint to a plate large enough to capture prints of both hands simultaneously.
The mosaic print sensor allows the operator to capture prints and compare the collected data against an onboard database. Data can also be uploaded and downloaded wirelessly. The unit can operate as a standalone unit or can be integrated with any biometric system.
In operation, the mosaic print sensor provides high reliability in harsh environments with excessive sunlight. To provide this capability, multiple wafer-scale optical sensors are digitally stitched together with pixel reduction. The resulting images are designed to exceed 500 dots per inch (dpi). Power is supplied by a battery or drawn parasitically from other sources using the USB protocol. Formatting complies with EFTS, EBTS NIST, ISO, and ITL1-2007.
Figure 46 illustrates the traditional optical approach used by other sensors. This approach is likewise based on FTIR (frustrated total internal reflection). In the figure, the finger contacts a prism and scatters light. A camera captures the scattered light. The ridges of the finger's print appear as dark lines, while the valleys of the fingerprint appear as bright lines.
Figure 47 shows the approach used by the mosaic sensor 4700. The mosaic sensor also uses FTIR. However, the plate is illuminated from the side, and the internal reflections are contained within the sensor's plate. As shown in the lower portion of the figure, the ridges of the fingerprint contact the prism and scatter light, allowing the camera to capture the scattered light. The ridges on the finger appear as bright lines, while the valleys appear as dark lines.
Figure 48 depicts the layout of the mosaic sensor 4800. An LED array is arranged around the periphery of the plate. Beneath the plate are the cameras used to capture the fingerprint image. The image is captured on this bottom plate, referred to as the capture plane. The capture plane is parallel to the sensor plane on which the finger is placed. The thickness of the plate, the number of cameras, and the number of LEDs can vary with the size of the plate's effective capture region. The thickness of the plate can be reduced by adding mirrors that fold the cameras' optical path, reducing the required thickness. Each camera should cover one inch of space, with some pixel overlap between cameras. This allows the mosaic sensor to achieve 500 ppi. The cameras may have a 60-degree field of view; however, there may be significant distortion in the image.
Figure 49 shows an embodiment 4900 of the camera fields of view, and their interaction, for the multiple cameras used in the mosaic sensor. Each camera covers a small capture region. This region depends on the camera's field of view and the distance between the camera and the top surface of the plate. α is half of the camera's horizontal field of view, and β is half of the camera's vertical field of view.
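The capture region described above follows directly from the half-angles: a camera at distance d from the platen surface sees a patch 2·d·tan(α) wide by 2·d·tan(β) high. A sketch with an assumed working distance, an assumed β, and an assumed VGA wafer-level camera, chosen to be consistent with the one-inch coverage and 500 ppi figures quoted earlier:

```python
import math

def capture_region(distance_in: float, alpha_deg: float, beta_deg: float):
    """Width and height of the patch one camera sees on the platen's
    top surface, from the half-angles of its field of view."""
    w = 2 * distance_in * math.tan(math.radians(alpha_deg))
    h = 2 * distance_in * math.tan(math.radians(beta_deg))
    return w, h

# 60-degree horizontal FOV (alpha = 30 deg) at an assumed 0.87 in
# working distance; beta assumed 22.5 deg for a 4:3-ish sensor.
w, h = capture_region(0.87, 30.0, 22.5)
ppi = 640 / w   # horizontal resolution of an assumed 640-pixel camera
print(f"{w:.2f} x {h:.2f} in -> {ppi:.0f} ppi")
```

With these assumptions each camera covers about one inch horizontally and comfortably exceeds 500 ppi, matching the design targets stated for the mosaic sensor.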
The mosaic sensor can be incorporated into the bio-phone and tactical computer illustrated in Figure 50. The bio-phone and tactical computer use a complete mobile computer architecture incorporating a dual-core processor, DSP, 3D graphics accelerator, 3G-4G, WLAN (per 802.11a/b/g/n), Bluetooth 3.0, and a GPS receiver. The bio-phone and tactical computer deliver capability comparable to a standard laptop in a phone-sized package.
Figure 50 illustrates the components of the bio-phone and tactical computer. The bio-phone and tactical computer assembly 5000 provides a display screen 5001, a speaker 5002, and a keyboard 5003 in a housing 5004. These elements are visible on the front of the bio-phone and tactical computer assembly 5000. Located on the back of the assembly 3800 are a camera 5005 for iris imaging, a camera 5006 for facial imaging and video recording, and a bio-print fingerprint sensor 5009.
To provide secure communication and data transmission, the device incorporates selectable 256-bit AES encryption with COTS sensors and pre-vetted biometric software for collecting biometrics from POIs. The software matches and archives through any approved biometric matching software for sending and receiving secure, "perishable" voice, video, and data communications. In addition, the bio-phone supports the Windows Mobile, Linux, and Android operating systems.
The bio-phone is a 3G-4G-enabled handheld device for reach-back to web portals and biometrics-enabled watchlist (BEWL) databases. These databases allow live comparison of captured biometric images and data. The device is designed to fit in a standard LBV or pocket. In embodiments, the bio-phone and tactical computer may use a mobile computer architecture featuring: a dual-core processor, DSP, 3D graphics accelerator, 3G-4G, WLAN (802.11a/b/g/n), Bluetooth 3.0 over secure and civilian networks, a GPS receiver, a WVGA sunlight-readable capacitive touch display, stereoscopic 3D video output, a tactile backlit QWERTY keyboard, onboard storage, support for multiple operating systems, and the like; it is designed to deliver laptop-class capability in a lightweight package.
The bio-phone can search, collect, enroll, and verify multiple types of biometric data, including face, iris, two-finger fingerprints, and personal biographic data. The device also records video, voice, gait, identifying marks, and pocket litter. Pocket litter includes the various small items typically carried in a pocket, wallet, or pouch, and may include, for example, loose change, identification cards, passports, credit cards, and other items. Figure 52 shows a typical set of such information. Depicted in Figure 52 is an example of a collection 5200 of pocket litter. The types of items that may be included are personal documents and photos 5201, books 5202, notebooks and paper 5203, and documents such as a passport 5204.
The bio-phone and tactical computer may include cameras, such as high-definition still and video cameras, capable of biometric data collection and video conferencing. In embodiments, the eyepiece camera and video-conferencing capabilities described herein can be used together with the bio-phone and tactical computer. For example, a camera integrated into the eyepiece can capture images and communicate them to the bio-phone and tactical computer, and vice versa. Data can be exchanged between the eyepiece and the bio-phone, and a network connection can be established or shared by either one. In addition, the bio-phone and tactical computer can be housed in a rugged, fully militarized construction that withstands militarized temperature ranges, is waterproof (to depths of 5 meters), and so on.
Figure 51 illustrates an embodiment 5100 of using the bio-phone to capture latent fingerprints and palm prints. Fingerprints and palm prints are captured at 1000 dpi with a scale overlay, using active illumination from ultraviolet diodes. Both fingerprints and palm prints 5100 can be captured using the bio-phone.
Using the GPS capability, data collected by the bio-phone is automatically geo-located and date/time stamped. Data can be uploaded or downloaded and compared against onboard or networked databases. This data transfer is facilitated by the device's 3G-4G, WLAN, and Bluetooth capabilities. Data entry can be accomplished with the QWERTY keyboard or by other available methods, such as a stylus or touch screen. Biometric data is archived after the most specific image available has been collected. Manual entry allows partial data capture. Figure 53 shows the interaction 5300 between digital dossier images maintained in a database and a biometric watchlist. The biometric watchlist is used to compare data captured in the field with previously captured data.
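Watchlist comparison of captured biometrics, as described above, is commonly done by thresholding a distance between binary templates — for iris codes, the fractional Hamming distance. This sketch uses made-up templates and an illustrative threshold, not the BEWL system's actual matching logic:

```python
def hamming_fraction(a: bytes, b: bytes) -> float:
    """Fraction of differing bits between two equal-length binary
    biometric templates (the usual iris-code comparison metric)."""
    assert len(a) == len(b)
    diff = sum(bin(x ^ y).count("1") for x, y in zip(a, b))
    return diff / (8 * len(a))

def search_watchlist(probe: bytes, watchlist: dict, threshold: float = 0.32):
    """Return the IDs whose stored template lies within `threshold`
    of the probe; an empty list means no watchlist hit."""
    return [pid for pid, tmpl in watchlist.items()
            if hamming_fraction(probe, tmpl) <= threshold]

watchlist = {"POI-17": bytes([0b10110010] * 8),
             "POI-42": bytes([0b01001101] * 8)}
probe = bytes([0b10110011] * 8)   # near POI-17: one bit off per byte
print(search_watchlist(probe, watchlist))
```

The tolerance threshold is what lets a noisy field capture still hit a previously enrolled record, at the cost of a controlled false-match rate.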
Formatting can use the EFTS, EBTS NIST, ISO, and ITL1-2007 formats to provide compatibility of the biometric data with a range of databases.
The specifications of the bio-phone and tactical computer are given below:
Operating temperature: -22 °C to +70 °C
Connectivity I/O: 3G, 4G, WLAN a/b/g/n, Bluetooth 3.0, GPS, FM
Connectivity output: USB 2.0, HDMI, Ethernet
Physical dimensions: 6.875" (H) × 4.875" (W) × 1.2" (D)
Weight: 1.75 lbs.
Processor: dual-core 1 GHz processor, 600 MHz DSP, and a 3D graphics accelerator rated at 30M polygons per second
Display: 3.8" WVGA (800 × 480) sunlight-readable, transflective, capacitive touch screen; scalable display output for driving three 1080p high-definition screens simultaneously.
Operating system: Windows Mobile, Linux SE, Android
Storage: 128 GB solid-state drive
Additional storage: dual SD card slots for an additional 128 GB of storage.
Memory: 4 GB RAM
Cameras: 3 high-definition still and video cameras: face, iris, and conferencing (user's face)
3D support: can output stereoscopic 3D video.
Camera sensor support: sensor dynamic-range extension, adaptive defect-pixel correction, advanced sharpness enhancement, geometric distortion correction, HW (hardware)-based face detection, video stabilization
Biometrics: onboard optical two-fingerprint sensor, face, DOMEX, and iris cameras
Sensors: an accelerometer, compass, ambient light sensor, proximity sensor, barometric pressure sensor, and temperature sensor can be accommodated as required.
Battery: less than 8 hours, 1400 mAh, rechargeable lithium-ion, hot-swappable battery pack.
Power: various power options for continuous operation.
Software features: face/gesture detection, noise filtering, pixel correction.
Powerful display processor with overlay, rotation, and resizing capability.
Audio: onboard microphone, speakers, and audio/video input.
Keyboard: full tactile QWERTY keyboard with adjustable backlight.
Additional devices and kits may also incorporate the mosaic sensor and can work together with the bio-phone and tactical computer to provide a complete field solution for collecting biometric data.
One such device is the pocket bio-kit illustrated in Figure 54. The components of the pocket bio-kit 5400 include a GPS antenna 5401, a bio-print sensor 5402, and a keyboard 5404, contained in a housing 5403. The specifications of the bio-kit are listed below:
Size: 6" × 3" × 1.5"
Weight: 2 lbs. total
Processor and memory: 1 GHz OMAP processor
650 MHz core
3D accelerator capable of handling up to 18 million polygons per second
64 KB L2 cache
166 MHz, 32-bit FSB
1 GB embedded PoP memory, expandable to up to 4 GB NAND
64 GB solid-state hard drive
Display: 75 mm × 50 mm, 640 × 480 (VGA) sunlight-readable LCD with anti-glare, anti-reflective, anti-scratch screen treatment
Interface: USB 2.0
10/100/1000 Ethernet
Power: battery operation: approximately 8 hours of continuous enrollment at about 5 minutes per enrollment.
Embedded capabilities: mosaic sensor optical fingerprint reader
Digital iris camera with active IR illumination
Digital face and DOMEX camera (visible) with flash
Quick-lock GPS
The features of the bio-phone and tactical computer may also be provided in a bio-kit for biometric data collection, folded into a rugged, compact housing. Data is collected using biometric-standard images and data formats, and can be cross-referenced against Department of Defense biometric authoritative databases for near-real-time data communication.
The pocket bio-kit shown in Figure 55 can capture latent fingerprints and palm prints at 1000 dpi with a scale overlay, using active illumination from ultraviolet diodes. The bio-kit has a 32 GB memory storage card and can interoperate with combat radios or computers to upload and download data under real-time battlefield conditions. Power is provided by a lithium-ion battery. The components of the bio-kit assembly 5500 include a GPS antenna 5501, a bio-print sensor 5502, and a housing 5503 with a base 5505.
Biometric data collection is geo-located to facilitate monitoring and tracking of the movement of individuals. The bio-kit can be used to collect fingerprints and palm prints, iris images, facial images, latent fingerprints, and video, and to enroll them in a database. Algorithms for fingerprints and palm prints, iris images, and facial images facilitate collection of these data types. To help capture iris images and latent fingerprint images, the bio-kit has IR and UV diodes that actively illuminate the iris or latent fingerprint. In addition, the pocket bio-kit is fully compliant with EFTS/EBTS, including ITL1-2007 and WSQ. The bio-kit meets MIL-STD-810 for operation in extreme environments, and uses a Linux operating system.
To capture images, the bio-kit uses a high-dynamic-range camera with wavefront coding for maximum depth of field, ensuring that the details in latent fingerprints and iris images are captured. Once captured, real-time image enhancement software and image stabilization immediately improve readability and provide outstanding visual discrimination.
The bio-kit can also record video and store full-motion (30 fps) color video in the onboard "camera on a chip".
The eyepiece 100 can interface with a mobile folding biometric enrollment kit (i.e., bio-kit) 5500, a biometric data collection system folded into a compact, rugged housing, such that when opened it becomes a small workstation for biometric data such as fingerprint, iris, and facial recognition, latent fingerprints, and the like, as described herein. As with other mobile biometric devices, the mobile folding biometric enrollment kit 5500 may be used as a standalone device or in association with the eyepiece 100, as described herein. In one embodiment, the mobile folding biometric enrollment kit can fold into a small size, such as 6" × 3" × 1.5", weighing, for example, 2 pounds. It may include a processor, digital signal processor, 3D accelerator, fast-verification-based hash (FSB) functionality, solid-state memory (such as package-on-package (PoP)), a hard disk drive, a display (such as a 75 mm × 50 mm, 640 × 480 (VGA) sunlight-readable LCD with an anti-glare, anti-reflective, anti-scratch screen), USB, Ethernet, an embedded battery, a mosaic optical fingerprint reader, a digital iris camera (such as with active IR illumination), a digital face and DOMEX camera with flash, quick-lock GPS, and the like. Data can be collected using biometric-standard images and data formats, and can be cross-referenced against Department of Defense biometric authoritative databases for near-real-time data communication. The device may be able to collect the biometric data and geo-location of persons of interest for monitoring and tracking, with wireless data upload/download using combat radios or computers with standard networking interfaces, and the like.
In addition to the bio-kit, the mosaic sensor may be incorporated into a wrist-mounted fingerprint, palm print, geo-location and POI enrollment device as shown in Figure 56. Eyepiece 100 may interface with biometric device 5600, which straps to a soldier's wrist or arm and folds open into a biometric data collection system for fingerprinting, iris recognition, computer-based biometric identification data, and the like, as described herein. The device may have an integrated computer, keyboard, sunlight-readable display, biometric sensing platen, and so forth, so that the operator can quickly store data, or remotely compare it, for collection and identification purposes. For example, the armband biometric sensing platen may be used to scan palms, fingerprints, and the like. The device may provide geo-location tagging of persons of interest, and collected data may be tagged with time, date, location, etc. As with other mobile biometric devices, the wrist-mounted device may be used as a standalone device or in association with eyepiece 100, as described herein. In an embodiment, the biometric device may be small and light enough to be worn comfortably on a soldier's arm, such as with a 5″ × 2.5″ active fingerprint and palm print sensor and a weight of 16 ounces. Algorithms for fingerprint and palm capture may be present. The device may include a processor, digital signal processor, transceiver, QWERTY keyboard, a large weather-resistant pressure-driven print sensor, a sunlight-readable transflective QVGA color backlit LCD display, an internal power supply, and the like.
In an embodiment, the wrist-mounted assembly 5600 includes the following elements: strap 5602 on housing 5601, setup and open/close buttons 5603, sensor protection cover 5604, pressure-driven sensor 4405, and keyboard with LCD screen 5606.
The fingerprint, palm print, geo-location and POI enrollment device includes an integrated computer, QWERTY keyboard and display. The display is designed for easy operation in bright sunlight and uses the LCD screen or LED indicators to alert the operator that a fingerprint or palm print capture has succeeded. The display uses a transflective QVGA color backlit LCD screen for improved readability. The device is lightweight and compact, weighing 16 ounces with the 5″ × 2.5″ mosaic sensor. The compact size and weight allow the device to slide into an LBV pocket or be strapped to the user's forearm, as shown in Figure 56. As with other devices incorporating the mosaic sensor, all POIs are tagged with geo-location information at the time of capture.
The size of the sensor screen allows ten-finger, palm, four-finger slap and fingertip capture. The sensor incorporates a large pressure-driven print sensor for rapid enrollment at 500 dpi under any weather condition, as specified by MIL-STD-810. Software algorithms support both fingerprint and palm print capture modes, and device management uses a Linux operating system. Capture is rapid thanks to a 720 MHz processor with a 533 MHz DSP. This processing capability delivers correctly formatted images to the system software of any existing approved system. In addition, the device is fully compliant with EFTS/EBTS, including ITL 1-2007 and WSQ.
As with other mosaic sensor devices, a removable UWB wireless 256-bit AES transceiver enables wireless communication. This also provides secure upload and download with biometric databases stored off-device.
Power is provided by lithium-polymer or AA alkaline batteries.
The wrist-mounted device described above may also be used with other devices, including an augmented reality eyepiece with data and video displays, as shown in Figure 57. Assembly 5700 includes the following components: eyepiece 5702 and bio-print sensor device 5700. The augmented reality eyepiece provides redundant, binocular, stereoscopic sensors and display, and provides the ability to view under a wide range of lighting conditions, from glaring noon sun to extremely low light at night. Operation of the eyepiece is simple using a rotary switch located on the temple of the eyepiece, and a user can access data from a forearm computer, sensor, or laptop device. The eyepiece also provides omnidirectional earbuds for hearing protection and improved hearing. A noise-canceling boom microphone may also be incorporated into the eyepiece to provide better communication of discriminated voice commands.
The eyepiece can communicate wirelessly with the bio-phone sensor and the forearm-mounted device using 256-bit AES-encrypted UWB. This also allows the device to communicate with laptops or combat radios, and to network with CP, TOC and biometric databases. The eyepiece is compatible with ABIS, EBTS, EFTS and JPEG 2000.
Similar to the other mosaic sensor devices described above, the eyepiece uses networked GPS and an RF filter array to provide highly accurate geo-location of a POI.
In operation, the low-profile, forearm-mounted computer and tactical display integrates face, iris, fingerprint, palm print and fingertip collection and identification. The device also records video, voice, gait and other distinguishing characteristics. Face and iris tracking is automatic, allowing the device to help recognize non-cooperative POIs. With the transparent display provided by the eyepiece, the operator may also view sensor imagery, moving maps, overlaid applications with navigation and targeting, the positions of individuals or other targets/POIs, UAVs and the like from sensors or other sources, as well as data and the biometric data being captured.
Figure 58 shows a further embodiment of the fingerprint, palm print, geo-location and POI enrollment device. The device weighs 16 ounces (about 450 grams) and uses a 5″ × 2.5″ active fingerprint and palm print capacitive sensor. The sensor can enroll ten fingers, palms, four-finger slaps and fingertip prints at 500 dpi. A 0.6–1 GHz processor with a 430 MHz DSP provides rapid enrollment and data capture. The hardware is compatible with ABIS, EBTS, EFTS and JPEG 2000, and features networked GPS for highly accurate positioning of persons of interest. In addition, the device communicates wirelessly with laptops or combat radios via 256-bit AES-encrypted UWB. Database information may also be stored on the device, allowing on-the-spot comparison without uploading. The on-board data may also be shared wirelessly with other devices, such as laptops or combat radios.
A further embodiment of the wrist-mounted bio-print sensor assembly 5800 includes the following elements: bio-print sensor 5801, wrist strap 5802, keyboard 5803, and combat radio connector interface 5804.
Data can be stored in the forearm device, since the device can use a Mil-con data storage cap to increase storage capacity. Data entry is performed on the QWERTY keyboard and can be done while wearing gloves.
The display is configured as a sunlight-readable transflective QVGA color backlit LCD display. Beyond operating in bright sunlight, the device can operate in a wide range of environments, since it meets MIL-STD-810 service requirements for extreme conditions.
The mosaic sensor described above may also be incorporated into a mobile folding biometric enrollment kit, as shown in Figure 59. The mobile folding biometric enrollment kit 5900 folds into itself, is sized to fit a tactical vest pocket, and has dimensions of 8 × 12 × 4 inches when unfolded.
Figure 60 illustrates an embodiment 6000 of a complete system in which the eyepiece and the forearm-mounted device interface to provide biometric data collection.
Figure 61 provides a system diagram 6100 for the mobile folding biometric enrollment kit.
In operation, the mobile folding biometric enrollment kit allows the user to search, collect, identify, verify and enroll the face, iris, palm print, fingertip and biographic data of a subject, and may also record voice samples, pocket litter and other visible identifying marks. Once collected, the data is automatically geo-located and date- and time-stamped. The collected data can be searched and compared against on-board and networked databases. For communication with databases not on the device, wireless data upload/download is provided via combat radios or laptops with standard networking interfaces. Formatting complies with EFTS, EBTS, NIST, ISO and ITL 1-2007. Pre-qualified images may be sent directly to matching software, since the device can use any matching and enrollment software.
Together, the devices and systems described above provide a comprehensive solution for mobile biometric data collection, identification and situational awareness. The devices can collect fingerprint, palm print, fingertip, face, iris, voice and video data for identifying uncooperative persons of interest (POIs). Video is captured with high-speed video, enabling capture under unstable conditions, such as from moving video. The captured information can be easily shared, and additional data entered via the keyboard. In addition, all data is tagged with date, time and geo-location. This facilitates rapid dissemination of the information needed for situational awareness in potentially volatile environments. With multiple personnel equipped with these devices, additional data collection is also possible, demonstrating the notion that "every soldier is a sensor". Sharing is facilitated by integrating the biometric devices with combat radios and battlefield computers.
In embodiments, the eyepiece may utilize flexible thin-film sensors, such as integrated into the eyepiece itself, into external devices interfacing with the eyepiece, and the like. A thin-film sensor may comprise a thin multilayer electromechanical configuration that generates an electrical signal upon a sudden or continuously varying contact force. Typical uses of electromechanical thin-film sensors include both on-off electrical switching in response to force and time-resolved force sensing. Thin-film sensors may include switches, force gauges and the like, where the thin-film sensor may rely on effects such as sudden electrical contact (switching), gradual change of electrical impedance under force, gradual release of electric charge under pressure, generation of a progressive electromotive force in a conductor moving through a magnetic field, and so on. For example, flexible thin-film sensors may be used in force/pressure sensors, with micro-object-sensitive pixels for two-dimensional force sensor arrays. This may be useful for touch screens of computers, smartphones, notebooks and MP3-like devices, especially those with military applications; for screens controlling anything under computer control (including unmanned aerial vehicles (UAVs), target drones, mobile robots and exoskeleton-based equipment); and the like. Thin-film sensors may be useful in security applications, such as remote or local sensors detecting intrusion, or the opening and closing of equipment, windows, gear, and the like. Thin-film sensors may be useful for trip-wire detection, such as together with the electronics used in silent, remote trip-wire detectors and radios. Thin-film sensors may be used for open-close detection and as force sensors detecting the stress and strain in vehicle compartments, ship hulls, aircraft panels, and the like. Thin-film sensors may be used as biometric sensors, such as when taking fingerprints, palm prints, fingertip ridge patterns, and the like. Thin-film sensors may be useful for leak detection, detecting leaking tanks, storage facilities, and the like. Thin-film sensors may be useful in medical sensors, such as when detecting fluid or blood outside the body. These sensor applications are intended as illustrations of the many applications for which thin-film sensors, and external devices that can be controlled and monitored, may be employed in association with the eyepiece, and are not intended to be limiting in any way.
Figure 62 illustrates an embodiment 6200 of a thin-film fingerprint and palm print collection device. The device can record four-finger slap prints (slaps) and rolled prints (rolls), palm prints, and fingerprints meeting the NIST standard. Superior-quality fingerprint images can be captured with either wet or dry hands. The device is reduced in weight and power consumption compared with other large sensors. In addition, the sensor is self-contained and hot-swappable. The sensor's configuration can be altered to suit various needs, and the sensor can be manufactured in various shapes and sizes.
Figure 63 depicts an embodiment 6300 of a fingerprint, palm and enrollment data collection device. The device records fingertips, rolls, slaps and palm prints. A built-in QWERTY keyboard allows entry of written enrollment data. As with the devices described above, all data is tagged with the date, time and geo-location of collection. The built-in database provides on-board matching of potential persons of interest against it. Matching may also be performed against other databases over the battlefield network. The device may be integrated with the optical biometric collection eyepiece described above to support face and iris recognition.
Specifications for the finger, palm and enrollment device are given below:
Weight and size: 16 ounces; forearm strap or fits into an LBV pocket
5″ × 2.5″ fingerprint/palm print sensor
5.75″ × 2.75″ QWERTY keyboard
3.5″ × 2.25″ LCD display
One-handed operation
Environment: operates in all weather conditions, −20 °C to +70 °C
Waterproof: 1 meter for up to 4 hours with no degradation in performance
Biometric collection: fingerprint and palm print collection and identification
Keyboard and LCD display for person-of-interest enrollment
Holds more than 30,000 full template portfolios (2 irises, 10 fingerprints, facial image, 35 fields of biographic information) for on-board matching of persons of interest.
All collected biometric data is tagged with time, date and location
Pressure-capacitive fingerprint/palm print sensor
30 fps high-contrast bitmap images
1000 dpi
Wireless: fully interoperable with battlefield radios, handhelds or laptops; 256-bit AES encryption
Battery: dual 2000 mAh lithium-polymer batteries
More than 12 hours of battery life; 15-second rapid battery change
Processing and memory: 256 MB flash and 128 MB SDRAM; supports 3 SD cards, up to 32 GB each.
600 MHz–1 GHz ARM Cortex-A8 processor
1 GB RAM
Figures 64–66 depict the use of a device incorporating the sensor for collecting biometric data. Figure 64 shows an embodiment 6400 of two-stage palm print capture. Figure 65 shows collection 6500 using a fingertip tap. Figure 66 illustrates an embodiment 6600 of collecting slap and rolled prints.
The discussion above relates to methods of collecting biometric data, such as obtaining fingerprints or palm prints using a platen or touch screen, as shown in Figures 62–66. The present disclosure also includes methods and systems for touchless or contactless fingerprinting using polarized light. In one embodiment, a person's fingerprint may be captured by using a polarized light source and retrieving fingerprint images from reflected polarized light in two planes. In another embodiment, a person's fingerprint may be captured by using a light source and retrieving fingerprint images with multispectral processing, such as two imagers at two different locations with different inputs. The different inputs may result from using different filters or different sensors/imagers. Applications of this technique may include biometric checking of unknown persons or subjects in situations where the safety of the person doing the checking may be at issue.
In this method, an unknown person or subject may approach a checkpoint, e.g., in order to be permitted to proceed to his or her destination. As depicted in system 6700 of Figure 67, the person P and the appropriate body part (such as a hand, palm P, or other part) is illuminated by polarized light source 6701. As is well known to those skilled in the optical arts, the polarized light source may simply be a lamp or other light source with a polarizing filter, emitting light polarized in one plane. The light travels to the person positioned in the area designated for contactless fingerprinting, so that the polarized light is incident on the person P's fingers or other body part. The incident polarized light is then reflected from the fingers or other body part, radiating from the person in all directions. Two imagers or cameras 6704 receive the reflected light after it has passed through optical elements such as lens 6702 and polarizing filter 6703. The cameras or imagers may be mounted on the augmented reality glasses, as discussed above with respect to Fig. 8F.
The light then travels from the palm or finger(s) of the person of interest to different polarizing filters 6704a, 6704b and on to the imagers or cameras 6705. The light passing through the polarizing filters may have a 90° difference in orientation (horizontal and vertical) or another orientation difference, such as 30°, 45°, 60° or 120°. The cameras may be digital cameras with suitable digital imaging sensors converting the incident light into suitable signals. The signals are then processed by suitable processing circuitry 6706, such as digital signal processors. The signals may then be combined in a conventional manner, such as by a digital microprocessor 6707 with memory. The digital processor with suitable memory is programmed to produce data suitable for an image of the palm, a fingerprint, or another desired image. The digital data from the imagers may then be combined in this process, for example using the techniques of U.S. Pat. No. 6,249,616 or others. As described above in this disclosure, the combined "image" may then be checked against a database to determine the identity of the person. The augmented reality glasses may include such a database in memory, or may refer the signal data elsewhere 6708 for comparison and checking.
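A minimal sketch of the combination step performed by processor 6707 is given below. The per-pixel maximum fusion rule is purely an illustrative assumption (the patent cites other techniques, e.g. U.S. Pat. No. 6,249,616, for the actual combination); the images are represented as plain 2D lists of grayscale values.

```python
def combine_polarized(img_h, img_v):
    """Fuse two polarization-filtered grayscale frames into one
    composite print image. Per-pixel maximum is an illustrative
    fusion rule only, not the method specified by the disclosure."""
    return [
        [max(h, v) for h, v in zip(row_h, row_v)]
        for row_h, row_v in zip(img_h, img_v)
    ]

# Hypothetical 2x2 frames from the horizontally and vertically
# polarized imagers 6704a/6704b (pixel values are made up).
frame_h = [[10, 200], [30, 40]]
frame_v = [[50, 100], [20, 90]]
composite = combine_polarized(frame_h, frame_v)
```

The composite would then be matched against a database, either on the glasses or at a remote facility 6708.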
A process for obtaining a contactless fingerprint, palm print or other biometric print is disclosed in the flowchart of Figure 68. In one embodiment, a polarized light source is provided 6801. In a second step 6802, the person of interest and the selected body part are positioned for illumination. In another embodiment, incident white light may be used rather than a polarized light source. When the image is ready to be taken, light is reflected 6803 from the person to two cameras or imagers. A polarizing filter is placed in front of each of the two cameras, so that the light received by the cameras is polarized 6804 in two different planes (such as horizontal and vertical planes). Each camera then detects 6805 the polarized light. The cameras or other sensors then convert the incident light into signals or data suitable for preparing images 6806. Finally, the images are combined 6807 to form a distinct, reliable print. The result is an image of very high quality that can be compared against digital databases to identify the person and to detect persons of interest.
It should be understood that although digital cameras are used in this contactless system, other imagers may be used, such as active pixel imagers, CMOS imagers, imagers that image at multiple wavelengths, CCD cameras, photodetector arrays, TFT imagers, and the like. It should also be understood that although polarized light is used to create two different images, other variations of the reflected light may be used instead. For example, rather than polarized light, white light may be used with different filters applied to the imagers, such as a Bayer filter, a CYGM filter or an RGBE filter. In other embodiments, the polarized light source may be eliminated, using natural or white light instead of polarized light.
The use of touchless or contactless fingerprinting has been under development for some time, as evidenced by earlier systems. For example, U.S. Pat. Appl. 2002/0106115 used polarized light in a contactless system, but required a metallic coating to be sprayed onto the fingers of the person being fingerprinted. Later systems, such as those of U.S. Pat. No. 7,651,594 and U.S. Pat. Appl. Publ. No. 2008/0219522, required contact with a platen or other surface. The contactless system described herein requires no contact at the time of imaging, nor prior contact such as applying a coating or reflective layer to the body part of interest. Of course, the positions of the imagers or cameras relative to each other should be known for easier processing.
In use, the contactless fingerprint system may be employed at a checkpoint, such as a compound entrance, a building entrance, a roadside checkpoint, or another convenient location. This may be a location where it is desired to admit some persons and to refuse entry to, or even detain, other persons of interest. In practice, if polarized light is used, the system may utilize an external light source such as a lamp. The cameras or other imagers used for contactless imaging may be mounted on opposite sides of one pair of augmented reality glasses (for one person). For example, a two-camera version is shown in Fig. 8F, where two cameras 870 are mounted on frame 864. In this embodiment, software for at least processing the images may be contained in the memory of the augmented reality glasses. Alternatively, the digital data from the cameras/imagers may be routed to a nearby data center for appropriate processing. That processing may include combining the digital data to form an image of the print. It may also include checking a database of known persons to determine whether the subject is a person of interest.
Another contactless fingerprinting method uses quantum dot lasers to scan fingers and hands without contact, detecting extremely low concentrations (parts per billion or even parts per trillion) of explosive compounds and narcotic compounds. For example, quantum dot or other types of lasers, or laser arrays, may be mounted in the back of the bio-phone or in the frame of the glasses, so that detection can occur at very close range yet without contact, preventing cross-contamination between subjects. As a result, in addition to the biometric data related to iris, fingerprints, face and voice collected with the glasses or other accessory devices, explosive or narcotics contamination identification may also be collected.
Alternatively, two persons may each use one camera, as seen with camera 858 in Fig. 8F. In this configuration, the two persons are relatively close together, so that their respective images will be sufficiently similar for suitable combination by software. For example, the two cameras 6705 in Figure 67 may be mounted on two different pairs of augmented reality glasses, such as two soldiers manning a checkpoint. Alternatively, the cameras may be mounted on a wall or at a fixed location of the checkpoint itself. The two images may then be combined by a remote processor 6707 with memory, such as a computer system at a building checkpoint.
As discussed above, persons using the augmented reality glasses may remain in constant contact with each other through at least one of many wireless technologies, especially when on duty at a checkpoint. Accordingly, data from the single-camera or two-camera version may be sent to a data center or other command post for appropriate processing, followed by checking databases for a match of the palm print, fingerprint, iris print, and so on. The data center may conveniently be located near the checkpoint. Given the availability of modern computers and storage, the cost of providing multiple data centers and wirelessly updating software will not be a major cost consideration for such systems.
The touchless or contactless biometric data gathering discussed above may be controlled in several ways, such as by the control techniques discussed elsewhere in this disclosure. For example, in one embodiment, a user may initiate a data gathering session by pressing a touch pad on the glasses or by giving a voice command. In another embodiment, a user may initiate a session by a hand movement or gesture, or by using any of the control techniques described herein. Any of these techniques may bring up a menu, from which the user may select an option, such as "begin data gathering session", "terminate data gathering session" or "continue session". If a data gathering session is selected, the computer-controlled menu may then offer choices for the number of cameras, which cameras, and so forth, much as a user selects a printer. There may also be various modes, such as a polarized-light mode, a color-filter mode, and the like. After each selection, the system may complete the task or offer another choice, as appropriate. User intervention may also be required, such as turning on the polarized light source or other light sources, applying filters or polarizers, and the like.
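The menu-driven session control described above can be sketched as a tiny state machine. The option labels and state names here are assumptions that merely paraphrase the menu choices in the text; a real implementation on the eyepiece would be far richer.

```python
class CaptureSession:
    """Illustrative sketch of the menu-driven control flow for a
    contactless biometric gathering session. Labels are assumed."""

    def __init__(self):
        self.active = False
        self.log = []          # record of operator selections

    def menu(self):
        """Return the menu choices appropriate to the current state."""
        if self.active:
            return ["Terminate data gathering session", "Continue session",
                    "Select cameras", "Polarized-light mode",
                    "Color-filter mode"]
        return ["Begin data gathering session"]

    def select(self, option):
        """Apply an operator selection and return the next menu."""
        if option == "Begin data gathering session":
            self.active = True
        elif option == "Terminate data gathering session":
            self.active = False
        self.log.append(option)
        return self.menu()

session = CaptureSession()
options = session.select("Begin data gathering session")
```

After each selection the next menu is offered, mirroring the "complete the task or offer another choice" behavior described above.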
After the desired fingerprint, palm print, iris image or other data has been obtained, the menu may then offer choices as to which database to use for comparison, which device to use for storage, and so forth. The touchless or contactless biometric data gathering system may be controlled by any of the methods described herein.
While the systems and sensors have obvious uses in identifying potential persons of interest, there are also positive battlefield uses. The fingerprint sensor may be used to call up a soldier's medical history, quickly and easily providing immediate information about allergies, blood type and other time-sensitive, treatment-determining data, thus allowing proper treatment to be given under battlefield conditions. This is especially helpful for patients who are unconscious when first treated and who may have lost their identification tags.
Further embodiments of a device for capturing biometric data from individuals may incorporate a server that stores and processes the collected biometric data. The captured biometric data may include hand images with multiple fingers, palm prints, facial camera images, iris images, audio samples of the individual's voice, and video of the individual's gait or movement. The collected data must be accessible to be useful.
Processing of the biometric data may occur locally or remotely at a separate server. Local processing may offer the option of capturing raw images and audio and making the information available on demand from a host computer over a WiFi or USB link. Alternatively, another local processing method processes the images and then transmits the processed data over the Internet. Such local processing includes the steps of finding the fingerprint, grading the fingerprint, finding the face and then cropping the iris, then grading it, and similar steps for the audio and video data. While processing the data locally requires more complex code, it offers the advantage of reduced Internet data transfer.
A scanner associated with the biometric data collection device may use code compatible with the USB Image Device protocol common among scanners. Other embodiments may use different scanner standards, depending on need.
When a WiFi network is used to transfer data, the bio-print device, described further herein, may operate or appear as a web server to the network. Each of the various types of images may be obtained from a browser client by selecting or clicking a web page link or button. This web server functionality may be part of the bio-print device, specifically included in its microcomputer functionality.
The web server may be part of the bio-print microcomputer, allowing the bio-print device to create a web page that exposes the captured data and also offers certain controls. Additional embodiments of a browser application may offer the following controls: capturing high-resolution hand prints, facial images and iris images, setting the camera resolution, setting the capture time for audio samples, and also enabling a streaming connection using a webcam, Skype or a similar mechanism. The connection may be attached to the audio and face cameras.
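The flavor of the exposed web page can be sketched as a function that renders an HTML index of captured files, each reachable by a link, in the spirit of the embedded web server described above. The URL layout, file names and page wording are all illustrative assumptions, not part of the disclosure.

```python
import html


def build_index_page(captures):
    """Render a minimal HTML index exposing captured biometric files.
    Each capture becomes a clickable link, as a browser client would
    use to fetch an image from the bio-print device's web server."""
    items = "\n".join(
        f'<li><a href="/capture/{html.escape(name)}">{html.escape(name)}</a></li>'
        for name in captures
    )
    return (
        "<html><body><h1>Bio-print device captures</h1>\n"
        f"<ul>\n{items}\n</ul>\n</body></html>"
    )


page = build_index_page(["palm_001.jpg", "iris_left.jpg", "voice_01.wav"])
```

In a full implementation this handler would sit behind a small HTTP server on the device's microcomputer, with additional endpoints for the capture controls (resolution, audio sample time, streaming).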
Further embodiments provide a browser application giving access to the captured images and audio via File Transfer Protocol (FTP) or other protocols. A further embodiment of the browser application may automatically refresh at a selectable rate to repeatedly grab preview images.
Additional embodiments provide local processing of the captured biometric data using the microcomputer, and provide additional controls to display the grading of the captured image, allow the user to rate each print found, retrieve the captured face, retrieve the cropped iris image, and allow the user to rate each iris print.
Another embodiment provides a USB port compatible with the Open Multimedia Application Platform (OMAP3) system. OMAP3 is a proprietary system-on-chip for portable multimedia applications. The OMAP3 device port is equipped with the Remote Network Driver Interface Specification (RNDIS), a proprietary protocol usable over USB. These systems provide the ability for the bio-print device to appear as an IP interface when plugged into a Windows computer USB host port. This IP interface works the same as it would over WiFi (a TCP/IP web server). This allows data to be moved off the microcomputer and provides display of the captured prints.
An application on the microcomputer may implement the above by receiving data from the FPGA over the USB bus. Once the data is received, JPEG content is created. The content may be written to a socket to a server running on a laptop, or written to a file. Alternatively, the server may receive the socket stream, pop up the images and open them in windows, creating a new window for each biometric capture. If the microcomputer runs Network File System (NFS, a protocol used with Sun-based systems) or SAMBA (a free-software re-implementation providing file and print services for Windows clients), the captured files may be shared and accessed by any client running NFS or Server Message Block (SMB). In this embodiment, a JPEG viewer displays the files. The display client may be a laptop, augmented reality glasses, or a phone running the Android platform.
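When JPEG captures are written to a socket, the receiving server needs some way to split the byte stream back into individual images. A 4-byte big-endian length prefix is a common framing choice; it is an illustrative assumption here, not a framing specified by the disclosure.

```python
import struct


def frame_jpeg(payload: bytes) -> bytes:
    """Length-prefix one JPEG buffer for transmission over a TCP
    socket to the laptop-side server (assumed 4-byte framing)."""
    return struct.pack(">I", len(payload)) + payload


def unframe(stream: bytes):
    """Split a concatenated stream of framed JPEGs back into the
    individual payloads, one per biometric capture."""
    images, offset = [], 0
    while offset < len(stream):
        (length,) = struct.unpack_from(">I", stream, offset)
        offset += 4
        images.append(stream[offset:offset + length])
        offset += length
    return images


# Two tiny stand-in "JPEG" payloads (real ones begin with 0xFFD8).
stream = frame_jpeg(b"\xff\xd8AAA") + frame_jpeg(b"\xff\xd8BB")
parts = unframe(stream)
```

The server side would call `unframe` on accumulated socket data and hand each recovered payload to a JPEG viewer window, one window per capture, as described above.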
An additional embodiment provides a server-side application that provides the same services described above.
An alternate embodiment of the server-side application displays the results on the augmented reality glasses.
A further embodiment provides the microcomputer on a removable platform, similar to a mass storage device or a streaming camera. The removable platform also incorporates an active USB serial port.
In embodiments, the eyepiece may include audio and/or visual sensors for capturing 360 degrees of sound and/or sight around the wearer of the eyepiece. This may come from sensors mounted on the eyepiece itself, or from sensors coupled to the eyepiece and mounted on a vehicle in which the wearer is located. For example, sound sensors and/or cameras may be mounted on the outside of a vehicle, where the sensors are communicatively coupled to the eyepiece to provide ambient sound and/or a "view" of the surrounding landscape. In addition, the audio system of the eyepiece can provide sound protection, cancellation, enhancement, and the like, to help improve the wearer's audio quality when the wearer is surrounded by external or loud noise. In one example, the wearer may be coupled to cameras mounted on the vehicle they are driving. These cameras may then communicate with the eyepiece and provide a 360-degree view around the vehicle, such as provided in a graphical image transmitted to the wearer through the eyepiece display.
In one example, and referring to Figure 69, control aspects of the eyepiece may include a remote control device in the form of a watch controller 6902, such as one including a receiver and/or transmitter for messaging with the eyepiece and/or controlling the eyepiece when the user is not wearing it. The watch controller may include a camera, a fingerprint scanner, discrete control buttons, a 2D control pad, an LCD screen, a capacitive touch screen for multi-touch control, a vibration motor/piezoelectric damper for haptic feedback, buttons with tactile feel, Bluetooth, an accelerometer, and the like, such as provided in the control function area 6904 of the watch controller 6902 or on other functional portions 6910. For example, the watch controller may have a standard watch display 6908, but additionally have eyepiece control functions, such as through control functions 6914 in the control function area 6904. The watch controller may display, and/or otherwise notify the user of (such as by vibration or audible sound), messages from the eyepiece, such as email, advertisements, calendar alerts, and the like, and may display the content of messages from an eyepiece the user is not currently wearing. A vibration motor, piezoelectric damper, or the like can provide haptic feedback through the touch screen control interface. The watch receiver may be able to provide virtual buttons and clicks in the control function area 6904 user interface, and buzz or tap the user's wrist when a message is received. Communication connectivity between the eyepiece and the watch receiver may be provided through Bluetooth, WiFi, a cellular network, or any other communication interface known in the art. The watch controller may utilize an embedded camera for video conferencing (as described herein), iris scanning (such as storing an image of the iris in a database for authentication against stored iris images), taking photos, video, and the like. The watch controller may have a fingerprint scanner as described herein. The watch controller, or any other haptic interface described herein, may measure the user's pulse, such as by a pulse sensor 6912 (which may be located in the watch band, under the watch body, or the like). In embodiments, the eyepiece and other control/haptic interface components may have pulse detection, so that the pulse from different control interface components is monitored in a synchronized manner, for health, activity monitoring, authentication, and the like. For example, the watch controller and the eyepiece may both have pulse monitoring, where the eyepiece can sense whether the two are synchronized, whether both match a previously measured profile (for authentication), and the like. Similarly, other biometric information can be used for authentication between multiple control interfaces and the eyepiece, such as fingerprints, iris scans, pulse, health profiles, and the like, where the eyepiece knows whether the same person is wearing both the interface component (such as the watch controller) and the eyepiece. Personal biometric/health information can be determined by viewing an IR LED view of the skin, seeing the pulse beneath the surface, and the like. In embodiments, multi-device authentication may be used (such as a token for a Bluetooth handshake), using sensors in, for instance, two devices (a fingerprint as the hash of a Bluetooth token).
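The same-wearer pulse check described above can be sketched as a simple comparison of the rate samples reported by the two devices. The tolerance value and the sample-by-sample comparison strategy are illustrative assumptions; a deployed system would likely compare beat timings rather than smoothed BPM values.

```python
def mean(xs):
    return sum(xs) / len(xs)

def pulses_synchronized(eyepiece_bpm, watch_bpm, tolerance_bpm=3.0):
    """Same-wearer heuristic: the eyepiece and watch pulse sensors
    should report nearly the same rate sample-by-sample, not merely
    the same average, since two calm people can share an average."""
    if len(eyepiece_bpm) != len(watch_bpm) or not eyepiece_bpm:
        return False
    diffs = [abs(a - b) for a, b in zip(eyepiece_bpm, watch_bpm)]
    return mean(diffs) <= tolerance_bpm

# Samples from one wearer track each other; two wearers diverge.
same_wearer = pulses_synchronized([72, 73, 75, 74], [71, 74, 75, 73])
two_people = pulses_synchronized([72, 73, 75, 74], [88, 90, 91, 89])
print(same_wearer, two_people)  # True False
```

A positive result here would be one factor alongside fingerprint or iris matching before the eyepiece treats the watch controller as an authenticated interface.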
In one embodiment, the watch controller may have a touch screen, which may be useful for controlling the glasses even when the glasses are not mounted on the user's face (such as when they are in a backpack). The transparent lens of the watch has an OLED display, with a switchable mirror applied under the lens. In other embodiments, the watch controller lens may include an electronic mirror or an electronic ink display. In either case, the lens can be overlaid on a standard analog watch movement, and the transparent lens including the switchable mirror, electronic mirror, or electronic ink display can be activated to display content. The watch can be used for gesture control, using integrated sensors that detect gestures. The watch can be used as an AR marker, so that when the camera of the glasses recognizes the watch, an application can be activated. The watch can be used as a physical surface on which an application overlays a virtual image, which in effect turns the watch into a touch screen interface.
Referring to Figures 70A-70D, the eyepiece may be stored in an eyepiece carrying case, such as one including recharging capability, an integrated display, and the like. Figure 70A depicts an embodiment of the case shown closed, with an integrated rechargeable AC plug and a digital display, and Figure 70B shows the same embodiment with the case open. Figure 70C shows another embodiment of the case closed, and Figure 70D shows the same embodiment in an open state, where the digital display shows through the lid. In embodiments, the case may have the ability to recharge the eyepiece when the eyepiece is placed in the case, such as through an AC connection or a battery (e.g., a rechargeable lithium-ion battery built into the carrying case for charging the eyepiece when away from AC power). Power may be delivered to the eyepiece through a wired or wireless connection, such as through a wireless inductive pad configuration between the case and the eyepiece. In embodiments, the case may include a digital display in communication with the eyepiece, such as through Bluetooth wireless or the like. The display may provide information about the state of the eyepiece, such as messages received, battery level indication, notifications, and the like.
Referring to Figure 71, the eyepiece 7120 may be used together with an unattended ground sensor unit 7102, such as a ground sensor unit formed as a stake 7104 that can be pushed into the ground 7118 by a person, dropped from an aircraft, deployed by an RC drone, and the like. The ground sensor unit 7102 may include a camera 7108, a controller 7110, sensors 7112, and the like. The sensors 7112 may include magnetic sensors, sound sensors, vibration sensors, thermal sensors, passive IR sensors, motion detectors, GPS, a real-time clock, and the like, and provide monitoring at the location of the ground sensor 7102. The camera 7108 may have a field of view 7114 in both azimuth and elevation, such as a full or partial 360-degree camera array in azimuth and ±90 degrees in elevation. The ground sensor unit 7102 can capture sensor and image data of an event and send it to the eyepiece 7120 through a wireless network connection. Further, the eyepiece can then transfer the data to an external communication facility 7122, such as a cellular network, a satellite network, a WiFi network, another eyepiece, and the like. In embodiments, the ground sensor units 7102 can relay data from one unit to another, such as from 7102A to 7102B to 7102C. Further, data can then be relayed from eyepiece 7120A to eyepiece 7120B and on to the communication facility 7122, such as in a data backhaul network. Data acquired from a ground sensor unit 7102 or an array of ground sensor units can be shared with multiple eyepieces, such as from eyepiece to eyepiece, from the communication facility to an eyepiece, and the like, so that eyepiece users can utilize and share the data in its raw form or in a post-processed form (such as a graphical display of the data through the eyepiece). In embodiments, the ground sensor units may be inexpensive, disposable, toy-grade, and the like. In embodiments, the ground sensor unit 7102 can provide backup for computer files from the eyepiece 7120.
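The unit-to-unit relay chain (7102A to 7102B to 7102C, then eyepiece to eyepiece to the communication facility) can be sketched as a report passed along a list of hops, with each node recorded so the receiving end can see the path taken. The report fields and node names are illustrative assumptions, not a defined message format.

```python
def relay_report(report, chain):
    """Pass a sensor report along a chain of relay nodes, appending
    each hop to a 'hops' list; the original report is left unmodified
    so the originating unit keeps its own clean copy."""
    for node in chain:
        report = dict(report, hops=report.get("hops", []) + [node])
    return report

# A motion event detected at unit 7102A, relayed through the chain
# shown in Figure 71 toward an eyepiece.
report = {"unit": "7102A", "event": "motion", "bearing_deg": 45}
delivered = relay_report(report, ["7102B", "7102C", "eyepiece-7120A"])
print(delivered["hops"])  # ['7102B', '7102C', 'eyepiece-7120A']
```

Recording the hop path would also let a backhaul network de-duplicate reports that arrive via more than one relay route.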
Referring to Figure 72, the eyepiece may provide control through facilities internal and external to the eyepiece, such as initiated from the surrounding environment 7202, captured from input devices 7204, from sensor devices 7208, from user action capture devices 7210, from internal processing facilities 7212, from internal multimedia processing facilities, from internal applications 7214, from a camera 7218, from sensors 7220, from an earphone 7222, from a projector 7224, through a transceiver 7228, through a haptic interface 7230, from external computing facilities 7232, from external applications 7234, from event and/or data feeds 7238, from external devices 7240, from third parties 7242, and the like. Command and control modes 7260 of the eyepiece may be initiated by the sensing of inputs through input devices 7244, user actions 7248, external device interactions 7250, the receipt of event and/or data feeds 7252, internal application execution 7254, external application execution 7258, and the like. In embodiments, there may be a series of steps included in executing control, including a combination of at least two of the following: events and/or data feeds, sensing inputs and/or sensing devices, user action capture inputs and/or outputs, user movements and/or actions for controlling and/or initiating commands, command and/or control modes and interfaces (in which the inputs may be reflected), applications on the platform that can use commands to respond to inputs, communications and/or connections from the platform interface to external systems and/or devices, external devices, external applications, feedback 7262 to the user (such as related to external devices or external applications), and the like.
In embodiments, events and/or data feeds may include email, military-related communications, calendar alerts, security events, financial events, personal events, requests for input, entering an active state, entering a military security event, indications, entering a combat-active state, entering a certain type of environment, entering a hostile environment, entering a certain location, and the like, and combinations thereof.
In certain embodiments, sensing inputs and/or sensing devices may include: a charge-coupled device, black silicon sensors, IR sensors, acoustic sensors, inductive sensors, motion sensors, optical sensors, opacity sensors, proximity sensors, inductance sensors, eddy-current sensors, passive infrared proximity sensors, radar, capacitance sensors, capacitive displacement sensors, Hall-effect sensors, magnetic sensors, GPS sensors, thermal imaging sensors, thermocouples, thermistors, photoelectric sensors, ultrasonic sensors, infrared laser sensors, inertial motion sensors, MEMS internal motion sensors, ultrasonic 3D motion sensors, accelerometers, inclinometers, force sensors, piezoelectric sensors, rotary encoders, linear encoders, chemical sensors, ozone sensors, smoke sensors, heat sensors, magnetometers, carbon dioxide detectors, carbon monoxide detectors, oxygen sensors, glucose sensors, smoke detectors, metal detectors, rain sensors, altimeters, GPS, detection of whether one is outside, detection of the environment, detection of activity, object detectors (e.g., a billboard), sign detectors (e.g., geographic location signs for advertising), laser rangefinders, sonar, capacitive sensing, optical response, heart-rate sensors, RF/micropower impulse radio (MIR) sensors, and the like, and combinations thereof.
In embodiments, user action capture inputs and/or devices may include a head tracking system, a camera, a voice recognition system, an eye-gaze detection system, a tongue touch pad, body motion sensors (such as dynamic sensors), a sip-and-puff system, a joystick, a cursor, a mouse, a touch screen, touch sensors, finger tracking devices, a 3D/2D mouse, inertial motion tracking, a microphone, wearable sensor sets, robotic motion detection systems, optical motion tracking systems, laser motion tracking systems, a keyboard, a virtual keyboard, a virtual keyboard on a physical platform, a context determination system, an activity determination system (such as for being on a train, on a plane, walking, exercising, etc.), a finger-following camera, a virtualized hand in the display, a sign-language system, a trackball, a hand-mounted camera, sensors located on the temple, sensors located on the glasses, Bluetooth communications, wireless communications, satellite communications, and the like, and combinations thereof.
In embodiments, user movements or actions for controlling or initiating commands may include: head movement, head shaking, nodding, head circling, forehead twitching, ear movement, eye movement, opening the eyes, closing the eyes, blinking, rolling the eyes, hand movement, making a fist, opening a fist, shaking a fist, extending a fist, retracting a fist, voice commands, sipping and puffing through a tube, tongue movement, finger movement, movement of one or more fingers, extending a finger, bending a finger, retracting a finger, extending the thumb, making a symbol with the fingers, making a symbol with a finger and the thumb, pressing a finger against the thumb, dragging and dropping with a finger, touching and dragging, touching and dragging with two fingers, wrist movement, wrist circling, wrist flipping, arm movement, arm extension, arm retraction, an arm left-turn signal, an arm right-turn signal, hands on hips, extending both arms, leg movement, kicking, leg extension, leg bending, jumping jacks, body movement, walking, running, turning left, turning right, turning around, spinning, raising and rotating both arms, raising and rotating one arm, rotating with various hand and arm positions, finger pinch and spread movements, finger movement (such as virtual typing), swiping, tapping, hip movement, shoulder movement, foot movement, drawing a brush stroke, sign language (such as ASL), and combinations thereof.
In embodiments, command and/or control modes and interfaces in which the inputs may be reflected may include: a graphical user interface (GUI), an audible command interface, clickable icons, navigable lists, a virtual reality interface, an augmented reality interface, a heads-up display, a semi-transparent display, a 3D navigation interface, a command line, a virtual touch screen, a robot control interface, typing (such as with a non-persistent virtual keyboard locked in place), a predictive and/or learning-based user interface (such as one that learns, in a "training mode", what the wearer does and when and where they do it), simple command modes (such as a gesture that launches a certain application, etc.), a Bluetooth controller, cursor hold, a locked virtual display, cursor movement positioned around the head, and the like, and combinations thereof.
In embodiments, applications on the eyepiece that can use commands and/or respond to inputs may include: military applications, weapon control applications, military targeting applications, war-game simulation, hand-to-hand combat simulators, repair manual applications, tactical operation applications, mobile phone applications (such as iPhone applications), information processing, fingerprint capture, facial recognition, information display, information transfer, information gathering, iris capture, entertainment, pilot ready-reference information, locating objects in 3D in the real world, targeting civilians, targeting police, teaching, hands-free tutorial guidance (such as in maintenance, in assembly, in first aid, and the like), navigation aids for the blind, communications, music, search, advertising, video, computer games, e-books, advertisements, shopping, e-commerce, video conferencing, and the like, and combinations thereof.
In embodiments, communications and/or connections from the eyepiece interface to external systems and devices may include a microcontroller, a microprocessor, a digital signal processor, a steering-wheel control interface, a joystick controller, motion and sensor resolvers, a stepper controller, an audio system controller, a program integrating sound and image signals, an application programming interface (API), a graphical user interface (GUI), navigation system controls, a network router, a network controller, a reconciliation system, a payment system, a gaming device, a pressure sensor, and the like.
In embodiments, external devices to be controlled may include: weapons, weapon control systems, communication systems, bomb detection systems, bomb disposal systems, remote-controlled vehicles, computers (and thus the many devices controllable by computer), cameras, projectors, cellular phones, tracking devices, displays (such as computer, video, and TV screens), video games, war-game simulators, mobile games, pointing or tracking devices, radios or audio systems, rangefinders, audio systems, an iPod, smart phones, TVs, entertainment systems, computer-controlled weapon systems, drones, robots, automotive dashboard interfaces, lighting devices (such as mood lighting), athletic equipment, gaming platforms (such as one identifying the user and preloading the games they like to play), vehicles, storage-enabled devices, payment systems, ATMs, POS systems, and the like.
In embodiments, applications associated with external devices may be military applications, weapon control applications, military targeting applications, war-game simulation, hand-to-hand combat simulators, repair manuals, tactical operation applications, communications, information processing, fingerprint capture, facial recognition, iris capture, entertainment, pilot ready-reference information, locating objects in 3D in the real world, targeting civilians, targeting police, teaching, hands-free tutorial guidance (such as in maintenance, in assembly, in first aid), navigation aids for the blind, music, search, advertising, video, computer games, e-books, automotive dashboard applications, advertisements, military enemy targeting, shopping, e-commerce, and the like, and combinations thereof.
In embodiments, feedback to the wearer related to external devices and applications may include: visual displays, heads-up displays, bullseye or target-tracking displays, tone outputs or audio alerts, performance or rating indicators, scores, task-completion indications, action-completion indications, content playback, information display, reports, data mining, recommendations, targeted advertisements, and the like.
In one example, control aspects of the eyepiece may include the following combination: a soldier's nod as a silent command initiated from motion (such as during an engagement), a graphical user interface as the command and/or control mode and/or interface in which the input is reflected, a military application on the eyepiece that reacts to the control input, an audio system controller for communications and/or connections from the eyepiece interface to external systems or devices, and the like. For example, during an engagement a soldier may be controlling secure communications equipment through the eyepiece and wish to change some aspect of the communications, such as the channel, the frequency, the encryption level, and the like, without making a sound and with minimal movement, to minimize the chance of being heard or seen. In this example, nods of the soldier's head can be programmed to indicate the change, such as a quick forward nod indicating the start of a transmission and a quick backward nod indicating the end of the transmission. In addition, the eyepiece may project a graphical user interface for the secure communications equipment to the soldier, such as showing which channels are active, which alternate channels are available, who else in their unit is currently transmitting, and the like. The soldier's nods can then be interpreted as change commands by the processing facility of the eyepiece, which transmits them to the audio system controller, with the communications equipment's graphical user interface showing the change. Further, certain nods/body movements may be interpreted as specific commands to be transmitted, so that the eyepiece sends a pre-established communication without the soldier having to be heard. That is, the soldier may be able to send prepared communications (such as those agreed upon with the unit before the engagement) to their unit through body movements. In this way, a soldier wearing the eyepiece may be able to connect and interface with external secure equipment in a completely covert manner, maintaining silent communications with their unit during an engagement, even out of the unit's line of sight. In embodiments, other movements or actions as described herein may also be employed for controlling or initiating commands, for the command and/or control modes and interfaces in which the inputs may be reflected, for the applications on the platform that can use commands and/or react to inputs, for the communications or connections from the platform interface to external systems and devices, and the like.
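The nod-to-command interpretation described above can be sketched as a lookup from recognized gestures to pre-established commands for the audio system controller. The gesture names and command strings are illustrative assumptions; in practice they would be agreed upon with the unit before the engagement.

```python
# Hypothetical mapping of recognized head gestures to pre-established
# silent commands (names are illustrative, not from the specification).
GESTURE_COMMANDS = {
    "nod_forward_quick": "BEGIN_TRANSMISSION",
    "nod_backward_quick": "END_TRANSMISSION",
    "nod_double": "SWITCH_TO_ALTERNATE_CHANNEL",
}

def interpret_gesture(gesture):
    """Translate a detected head gesture into a command for the audio
    system controller; unrecognized gestures are ignored, not sent."""
    return GESTURE_COMMANDS.get(gesture)

# A stream of detected motions; only programmed gestures become commands.
detected = ["nod_forward_quick", "head_scratch", "nod_backward_quick"]
sent = [c for g in detected if (c := interpret_gesture(g)) is not None]
print(sent)  # ['BEGIN_TRANSMISSION', 'END_TRANSMISSION']
```

Ignoring unrecognized motions, rather than guessing, matters here: an accidental transmission during an engagement is worse than a missed gesture.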
In one example, control aspects of the eyepiece may include the following combination: motion and position sensors as sensing inputs, an augmented reality interface as the command and control interface in which the input to the soldier can be reflected, a motion sensor and the rangefinder of a weapon system as external devices to be controlled and from which information is acquired, feedback to the soldier related to the external devices, and the like. For example, a soldier wearing the eyepiece may use a motion sensor to monitor military movement in a certain environment, and when the motion sensor is triggered, an augmented reality interface can be projected to the wearer to help identify the target, person, vehicle, and the like, for further monitoring and/or targeting. In addition, the rangefinder may be able to determine the distance to the object, with that information fed back to the soldier for aiming (such as manually, where the soldier executes the firing action, or automatically, where the weapon system receives the information and aims, with the soldier providing the fire command). In embodiments, the augmented reality interface can provide the soldier with information about the target, such as the object's position on a projected 2D or 3D map, the identity of the target from previously acquired information (such as stored in an object database, including facial recognition and object recognition), the coordinates of the target, night-vision imaging of the target, and the like. In embodiments, the triggering of the motion detector can be interpreted by the processing facility of the eyepiece as an alert event, which can be transmitted to the rangefinder to determine the object's position, and passed to the speakers of the eyepiece earphones to provide the soldier with an audio alert that the movement of an object has been sensed in the monitored area. The audio alert, plus a visual indicator, may serve as input to the soldier that attention should be paid to the moving object, such as when the object has been identified through an accessed database as a known suspect, a soldier, a known vehicle type, and the like. For example, a soldier may be monitoring the perimeter of an outpost at night from a sentry post. In this case, the environment may be dark, and the soldier may slip into a low state of alertness, because it is late at night and all environmental conditions are quiet. The eyepiece may then act as a sentry-augmentation facility, conducting "observation" from the soldier's personal vantage point (as opposed to some external surveillance facility of the outpost). When the eyepiece senses movement, the soldier is immediately alerted and directed to the position, distance, identity, and the like of the movement. In this way, the soldier may be able to react to avoid personal danger, target the located movement with fire, and alert the outpost to the potential danger. Further, if combat ensues, the soldier may have an improved reaction time due to the alert from the eyepiece, make better decisions from the information about the target, and minimize casualties or the danger of the outpost being infiltrated. In embodiments, other sensing inputs and/or sensing devices as described herein may also be employed, along with the command and/or control modes and interfaces in which the inputs may be reflected, useful external devices to be controlled, feedback related to external devices and/or external applications, and the like.
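The motion-trigger-to-audio-alert pipeline above can be sketched as a function that fuses the detection with the rangefinder reading and a database lookup into the alert text spoken through the eyepiece earphones. The field names and database contents are illustrative assumptions.

```python
def build_alert(detection, known_objects):
    """Fuse a motion detection with rangefinder data and an object-
    database lookup into the audio alert text for the wearer; objects
    missing from the database are reported as unidentified rather
    than suppressed, since the wearer should still be warned."""
    identity = known_objects.get(detection["object_id"], "unidentified")
    return (f"Movement: {identity}, bearing {detection['bearing_deg']} deg, "
            f"range {detection['range_m']} m")

# A detection matched against previously acquired recognition data.
known = {"obj-17": "known vehicle type: light truck"}
alert = build_alert(
    {"object_id": "obj-17", "bearing_deg": 210, "range_m": 140}, known)
print(alert)
```

The resulting string would be handed to a text-to-speech facility for the earphones, with the same detection record driving the projected visual indicator.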
In embodiments, the eyepiece may allow remote control of vehicles such as trucks, robots, drones, helicopters, ships, and the like. For example, a soldier wearing the eyepiece may be able to issue commands through an internal communications interface for controlling a vehicle. Vehicle control may be provided through voice commands, body movements (such as where the soldier is equipped with motion sensors in interactive communication with the eyepiece, controlling the vehicle through the eyepiece), a keyboard interface, and the like. In one example, a soldier wearing the eyepiece may provide remote control of a bomb-disposal robot or vehicle, where the commands are generated by the soldier through the command interface of the eyepiece, as described herein. In another example, the soldier may command aircraft, such as a remotely piloted drone, a remote-controlled counter-rotating tactical helicopter, and the like. Again, the soldier can provide control of the remote-controlled aircraft through a control interface as described herein.
In one example, control aspects of the eyepiece may include the following combination: a wearable sensor set as the soldier's motion-capture input, a robot control interface as the command and control interface in which the input can be reflected, a drone or other robotic device as the external device to be controlled, and the like. For example, a soldier wearing the eyepiece may be equipped with a sensor set for controlling a military drone, such as using motion sensor input to control the drone's movement, using hand-recognition control to manipulate the drone's control features (for example, through a graphical user interface displayed through the eyepiece), using voice command input to control the drone, and the like. In embodiments, control of the drone through the eyepiece may include flight control, control of onboard interrogation sensors (such as visible cameras, IR cameras, radar), threat avoidance, and the like. The soldier may be able to use the body-mounted sensors and, through projected virtual 2D/3D imagery depicting the actual battlefield, direct the drone to its intended target, where flight, camera, and surveillance controls are commanded through the soldier's body movements. In this way, the soldier may be able to maintain a personalized, fully visual immersion in the drone's flight and environment for more intuitive control. The eyepiece may have a robot control interface for managing and reconciling the various control inputs from the sensor set worn by the soldier, and for providing the interface for controlling the drone. The drone can then be remotely controlled through the soldier's physical actions, such as through a wireless connection to a military control center for drone command and management. In another similar example, the soldier may control a bomb-disposal robot, which can be controlled through the sensor set worn by the soldier and the associated eyepiece robot control interface. For example, the soldier may be provided with a graphical user interface giving a 2D or 3D view of the environment around the bomb-disposal robot, where the sensor set translates the soldier's movements (such as of the arms, hands, and the like) into movements of the robot. In this way, the soldier may be provided with a remote-control interface for the robot, allowing better, more sensitive control during a delicate bomb-disposal procedure. In embodiments, other user action capture inputs and/or devices as described herein may also be employed, along with the command and/or control modes and interfaces in which the inputs may be reflected, useful external devices to be controlled, and the like.
In one example, control aspects of the eyepiece may include the following combination: an event indication to the soldier when the soldier enters a certain location, a prediction- and learning-based user interface as the command and control mode and/or interface in which the event-occurrence input may be reflected, a weapon control system as the external device to be controlled, and the like. For example, the eyepiece may be programmed to learn the soldier's behavior, such as what the soldier typically does upon entering a specific environment with a specific weapon control system, such as whether the wearer turns the system on, arms the system, calls up a visual display for the system, and the like. From this learned behavior, the eyepiece may be able to make predictions about what the soldier wants in terms of eyepiece control functions. For example, the soldier may be thrust into a combat situation and need to use the weapon control system immediately. In this case, the eyepiece may sense the position and/or identity of the weapon system as the soldier approaches it, configure/enable the weapon system the way the soldier typically configures it when approaching (such as learned from previous use of the weapon system while the eyepiece was in a learning mode), and command the weapon control system to power up in its last configuration. In embodiments, the eyepiece may sense the position and/or identity of the weapon system through a variety of methods and systems that identify the position, such as through a vision system, an RFID system, a GPS system, and the like. In embodiments, commands may be given to the weapon control system by providing the soldier with a graphical user interface with a view of the weapon system's fire control, providing the soldier with an audio-voice command system interface offering selection and speech recognition for issuing commands, a predetermined automatic activation of a certain function, and the like. In embodiments, there may be profiles associated with these learned commands, where the soldier can modify the learned profile and/or set preferences in the learned profile to help optimize automated actions, and the like. For example, the soldier may have separate weapon-control profiles for weapon readiness (i.e., on duty and awaiting action) and for actively engaging the enemy with the weapon. The soldier may need to modify a profile to accommodate changing conditions associated with weapon system changes, such as changes in fire-command protocols, ammunition type, increased capabilities of the weapon system, and the like. In embodiments, other events and/or data feeds as described herein may also be employed, along with the command and/or control modes and interfaces in which the inputs may be reflected, useful external devices to be controlled, and the like.
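The learning-mode behavior above can be sketched as recording which configuration the wearer applies at each weapon system and then predicting the most frequent one when that system is next approached. The class, identifiers, and configuration strings are illustrative assumptions; a real learned profile would also weigh context such as readiness versus engagement.

```python
from collections import Counter

class ProfileLearner:
    """Learning-mode sketch: observe which configuration the wearer
    applies at each system, then predict the most frequent one when
    the wearer next approaches that system."""

    def __init__(self):
        self.history = {}  # system_id -> list of observed configurations

    def observe(self, system_id, configuration):
        self.history.setdefault(system_id, []).append(configuration)

    def predict(self, system_id):
        seen = self.history.get(system_id)
        if not seen:
            return None  # no learned behavior yet; take no automatic action
        return Counter(seen).most_common(1)[0][0]

learner = ProfileLearner()
for cfg in ["armed+display", "armed+display", "standby"]:
    learner.observe("ws-01", cfg)
print(learner.predict("ws-01"))  # armed+display
print(learner.predict("ws-99"))  # None
```

Returning `None` for an unknown system reflects the safety choice implied by the text: with no learned profile, the eyepiece should not auto-configure a weapon system.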
In one example, control aspects of the eyepiece may include the following combination: the soldier's personal duty events (such as being deployed in an active war zone and managing their time) as the event and/or data feed, a speech recognition system as the user-action capture input device, an audible command interface as the command and control interface into which inputs may be reflected, video-based communications as the application on the eyepiece used to respond to inputs from the soldier, and so on. For example, a soldier wearing the eyepiece may receive a projected visual indication of a scheduled event regarding a group video communication among commanders in support of the group. The soldier may then use voice commands into the audible command interface on the eyepiece to call up contact information for the call, and initiate the group video communication by voice command. In this way, the eyepiece may act as a personal assistant for the soldier, calling up scheduled events and providing the soldier a hands-free command interface for executing them. In addition, the eyepiece may provide a visual interface for the group video communication, where images of the other commanders are projected to the soldier through the eyepiece, and where an external camera provides video images of the soldier through a communications connection with the eyepiece (with a camera external device, with a mirror utilizing an internally integrated camera, and the like, as described herein). In this way, the eyepiece may provide a fully integrated personal assistant and phone/video communications platform, subsuming the functions of other traditionally separate electronic devices, such as a radio, a mobile phone, a video phone, a personal computer, a calendar, hands-free command and control interfaces, and the like. In embodiments, other events and/or data feeds as described herein, user-action capture inputs and/or devices, command and/or control modes and interfaces into which inputs may be reflected, platforms and/or applications usable to respond to command inputs, and the like may also be applied.
In one example, control aspects of the eyepiece may include the following combination: a soldier's security event as the event and/or data feed; a camera and touch screen as user-action capture input devices; information processing, fingerprint capture, and facial recognition applications on the eyepiece to respond to inputs; a graphical user interface for communications and/or connection between the eyepiece and external systems and devices; external information processing, fingerprint capture, and facial recognition applications and databases for accessing external security devices and connectivity; and the like. For example, a soldier may receive a "security event" while on duty at a military checkpoint, where a plurality of individuals are to be security checked and/or identified. In this case, there may be a need to record biometric information for these individuals, such as because they do not appear in a security database, because of suspicious behavior, because they fit the profile of a combatant, and so on. The soldier may then use biometric input devices, such as a camera for photographing faces and a touch screen for recording fingerprints, where the biometric inputs are managed through the internal information processing, fingerprint capture, and facial recognition applications on the eyepiece. In addition, the eyepiece may provide a graphical user interface as the communications connection to external information processing, fingerprint capture, and facial recognition applications, where the graphical user interface provides a data capture interface, external database access, a persons-of-interest database, and the like. The eyepiece may provide an end-to-end security management facility, including surveillance of suspects, input devices for acquiring biometric data, display of inputs and database information, connectivity to external security and database applications, and the like. For example, the soldier may be screening people passing through the military checkpoint and be ordered to collect facial images, such as with iris biometric information, of anyone who fits a profile but does not currently exist in the security database. As individuals approach the soldier, such as in a line of people about to pass through the checkpoint, the soldier's eyepiece takes high-resolution images of each individual for facial and/or iris recognition, such as checked against a database accessible through a network communications link. If a person does not fit the profile (e.g. a young child), or is in the database with no indication that they are considered a threat, the individual may be allowed through the checkpoint. If a person has been indicated as a threat, or fits the profile but is not in the database, the individual may not be allowed through the checkpoint and may be pulled aside. If they need to be entered into the security database, the soldier may be able to process the individual through the eyepiece's facilities, or by using the eyepiece to directly control external devices, such as to collect the individual's personal information, take close-up images of the individual's face and/or irises, record fingerprints, and the like, as described herein. In embodiments, other events and/or data feeds as described herein, user-action capture inputs and/or devices, applications usable on the platform to respond to inputs, communications or connections from the platform interface to external systems and devices, applications for external devices, and the like may also be applied.
In one example, control aspects of the eyepiece may include the following combination: finger movement as the user action initiating an eyepiece command, a clickable icon as the command and control mode and/or interface into which user actions may be reflected, applications on the eyepiece (such as weapon control, troop movement, information data feeds), a military application-tracking API as the communications and/or connection from the eyepiece application to external systems, an external personnel-tracking application, feedback to military personnel, and the like. For example, a system for monitoring the soldier may be implemented through an API that monitors the soldier's selection of applications on the eyepiece, such that the monitoring provides the military a service for monitoring and tracking application usage, provides the soldier feedback about other applications available to them based on the monitored behavior, and the like. In a process, the soldier may select a certain application for use and/or download, such as through a graphical user interface presenting clickable icons, and the soldier may be able to select the icon through a device enabling finger-movement-based control (such as a camera or inertial system through which the soldier's finger movement is used as the control input, in this case to select the clickable icon). The selection may then be tracked and monitored through the military application-tracking API, which sends the selection, or stored multiple selections (such as selections stored over a period of time), to the external personnel-tracking application. The soldier's application selections, in this case "virtual clicks", may then be analyzed in order to optimize utilization, such as by increasing bandwidth, changing the available applications, improving existing applications, and the like. Further, the tracking application may use the analysis to determine what the wearer's preferences are with respect to application usage, and send feedback to the wearer in the form of recommendations for applications that may be of interest, preference profiles, lists of applications currently in use by other similar military users, and the like. In embodiments, while helping to guide the military's use of the eyepiece and its applications, the eyepiece may provide services to improve the soldier's experience with the eyepiece, such as providing the soldier usage recommendations they may benefit from, and the like. For example, a soldier who is new to using the eyepiece may not be fully utilizing its capabilities, such as in the use of augmented reality interfaces, organizational facilities, task support, and the like. The eyepiece may have the ability to monitor the soldier's utilization, compare that utilization against usage metrics (such as stored in an external eyepiece utilization facility), provide the soldier feedback to improve their use of the eyepiece and associated efficiencies, and the like. In embodiments, other user movements or actions for control or command initiation as described herein, command and/or control modes and interfaces into which inputs may be reflected, applications usable on the platform to respond to inputs, communications or connections from the platform interface to external systems and devices, applications for external devices, feedback related to external devices and/or applications, and the like may also be applied.
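The "virtual click" tracking and recommendation loop described above can be illustrated with a minimal sketch: the eyepiece accumulates application selections locally and, given an aggregate popularity table (assumed here to come from the external tracking service), suggests popular applications the wearer has not yet tried. The names (`UsageTrackingAPI`, `record_click`, `recommendations`) and the example application identifiers are assumptions for illustration only.

```python
from collections import Counter

class UsageTrackingAPI:
    """Accumulates 'virtual clicks' (application selections) and produces
    simple recommendations from an aggregate popularity table."""

    def __init__(self, popular_apps):
        self.clicks = Counter()
        # Ordered most-popular-first; assumed to come from the external service.
        self.popular_apps = popular_apps

    def record_click(self, app):
        self.clicks[app] += 1

    def recommendations(self, limit=2):
        # Suggest popular applications the wearer has not yet selected.
        unused = [a for a in self.popular_apps if a not in self.clicks]
        return unused[:limit]

api = UsageTrackingAPI(["nav-3d", "threat-feed", "unit-chat", "logistics"])
for app in ["unit-chat", "unit-chat", "nav-3d"]:
    api.record_click(app)
print(api.recommendations())  # ['threat-feed', 'logistics']
```

In a fielded system the stored click history would be batched to the external personnel-tracking application, as the paragraph above describes, rather than analyzed on-device.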
In one example, control aspects of the eyepiece may include the following combination: sensors such as IR, thermal, force, and carbon monoxide sensors as inputs; a microphone as an additional input device; voice commands as the soldier's action for initiating commands; a heads-up display as the command and control interface into which inputs may be reflected; an instructional-guidance application that provides direction while reducing the need for the soldier's hands, such as in field repair, maintenance, assembly, and the like; a visual display providing feedback to the soldier based on the soldier's actions and sensor inputs; and so on. For example, a soldier's vehicle may be damaged in an engagement, leaving the soldier stranded without immediate transportation. The soldier may be able to call up the instructional-guidance application, which when run by the eyepiece provides hands-free instruction and access to computer-based expertise for diagnosing the vehicle's problem. In addition, the application may provide tutorials for procedures the soldier is unfamiliar with, such as restoring basic and temporary functionality of the vehicle. The eyepiece may also monitor various sensor inputs related to the diagnosis, such as IR, thermal, force, ozone, and carbon monoxide sensors, where the sensor inputs may be accessible to the instructional application and/or directly to the soldier. The application may also accept voice commands through the microphone, and use the heads-up display for showing instructional information, 2D or 3D depictions of the vehicle parts being repaired, and the like. In embodiments, the eyepiece may be able to provide the soldier a hands-free virtual assistant to help them diagnose and repair the vehicle, so as to reestablish transportation and allow the soldier to re-engage the enemy or move to a point of safety. In embodiments, other sensing inputs and/or sensing devices as described herein, user-action capture inputs and/or devices, user movements or actions for control or command initiation, command and/or control modes and interfaces into which inputs may be reflected, applications usable on the platform to respond to commands and/or inputs, feedback related to external devices and/or applications, and the like may also be applied.
In one example, control aspects of the eyepiece may include the following combination: the eyepiece entering an "active state", such as a "military engagement" activity mode, e.g. through received mission instruction, through the soldier commanding the eyepiece into a military firing mode, or through the eyepiece sensing that it is near a certain military activity, perhaps even a predetermined or targeted engagement zone, which may have been developed in part from monitoring and learning the wearer's general engagement assignments. Continuing the example, entry into the active state (such as a military engagement active state, e.g. while driving a vehicle into an encounter with the enemy or into hostile territory) may be combined with the following: an object detector as the sensing input or sensing device, a head-mounted camera and/or eye-gaze detection system as the user-action capture input, eye movement as the user movement or action for control or command initiation, a 3D navigation interface as the command and control mode and/or interface into which inputs may be reflected, an engagement management application onboard the eyepiece as the application for coordinating command inputs and the user interface, a navigation system controller as the communications or connection with external systems or devices, a vehicle navigation system as the external device to be controlled, a military planning and execution facility as the external device to be controlled and/or interfaced with as the external application for processing the wearer's actions with respect to military instruction, a reticle or target tracking system as feedback to the wearer regarding opportunities to target the enemy within view while driving, and so on. For example, a soldier may enter a hostile environment while driving their vehicle, and the eyepiece, detecting the enemy engagement zone (such as by GPS, by direct observation of a target through the integrated camera, and the like), may enter a "military engagement active state" (such as enabled and/or approved by the soldier). The eyepiece may then use the object detector to detect enemy vehicles, enemy residences, and the like, to locate and target enemy opportunities, such as through the head-mounted camera. Further, the eye-gaze detection system on the eyepiece may monitor where the soldier is looking, and may highlight information about targets at the wearer's gaze location, such as enemy personnel, enemy vehicles, enemy weapons, and friendly forces, where friend and foe are identified and differentiated. The soldier's eye movements may also be tracked, such as for changing the target of interest, or for command input (such as a quick nod to indicate a select command, a downward eye movement to indicate a command for additional information, and the like). The eyepiece may invoke projection of the 3D navigation interface to help provide the soldier information related to their surroundings, and the military engagement application for coordinating the state of the military engagement activity, such as taking inputs from the soldier, providing output to the 3D navigation interface, interfacing with external devices and applications, and the like. The eyepiece may, for example, use the navigation system controller to dock with the vehicle navigation system so as to include the vehicle navigation system in the military engagement experience. Alternatively, the eyepiece may use its own navigation system, such as to substitute for or augment the vehicle's system, e.g. when the soldier leaves the vehicle and it is desirable to provide them directions on the ground. As part of the military engagement active state, the eyepiece may interface with the external military planning and execution facility, such as for providing current status, troop movements, weather conditions, friendly positions and troop strengths, and the like. In embodiments, by entering the active state, the soldier may be provided feedback associated with that active state, such as a military engagement active state providing feedback in the form of information associated with identified targets. In embodiments, other events and/or data feeds as described herein, sensing inputs and/or sensing devices, user-action capture inputs and/or devices, user movements or actions for control and/or command initiation, command and/or control modes and interfaces into which inputs may be reflected, applications usable on the platform to respond to inputs, communications or connections from the platform interface to external systems and devices, applications for external devices, feedback related to external devices and/or applications, and the like may also be applied.
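The entry into an active state described above amounts to a small state transition gated on two conditions: proximity to a known engagement zone and the soldier's enablement/approval. The sketch below illustrates that gating under stated assumptions: a crude flat-earth distance approximation, a hypothetical fixed radius, and invented names (`next_state`, `ENGAGEMENT_RADIUS_KM`); none of these appear in the disclosure.

```python
import math

ENGAGEMENT_RADIUS_KM = 5.0  # hypothetical proximity threshold

def distance_km(a, b):
    # Crude flat-earth approximation (adequate for short ranges):
    # 1 degree of latitude ~ 111 km; longitude scaled by cos(latitude).
    dlat = (a[0] - b[0]) * 111.0
    dlon = (a[1] - b[1]) * 111.0 * math.cos(math.radians(a[0]))
    return math.hypot(dlat, dlon)

def next_state(current, position, zones, approved):
    """Enter the 'engaged' active state only when the wearer is inside a
    known zone AND has approved the transition; otherwise remain idle."""
    in_zone = any(distance_km(position, z) <= ENGAGEMENT_RADIUS_KM for z in zones)
    return "engaged" if (in_zone and approved) else "idle"

zones = [(34.05, -118.25)]
print(next_state("idle", (34.06, -118.26), zones, approved=True))   # engaged
print(next_state("idle", (34.06, -118.26), zones, approved=False))  # idle
print(next_state("idle", (40.0, -75.0), zones, approved=True))      # idle
```

The same gate could take its zone list from learned engagement assignments, GPS, or direct camera observation, as the paragraph above enumerates.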
In one example, control aspects of the eyepiece may include the following combination: receipt of a secure communication as the trigger event to the soldier, inertial movement tracking as the user-action capture input device, finger drag-and-drop and swipe motions made by the soldier as the user movements or actions for control or command initiation, a navigable list as the command and control interface into which inputs may be reflected, information conveyance as a type of application usable on the eyepiece to respond to inputs, a call-up system as the communications or connection interface from the eyepiece to external systems and devices, an iris capture and identification system as the external application for external systems and devices, and the like. A soldier wearing the eyepiece may receive a secure communication that enters the eyepiece as an "event" to the soldier, such as triggering a certain operational mode of the eyepiece, with visual and/or audio alerts, launching applications or actions on the eyepiece, and the like. The soldier may be able to react to the event through multiple control mechanisms, such as the wearer using their fingers and hands to "drag and drop" through a gesture interface, swipe, and the like (such as through the eyepiece's onboard camera and a gesture application, where the wearer drags a communication email or message into a folder, an application, another communication, and so on). The wearer may call up a navigable list as part of taking action on the communication. The user may communicate the information from the secure communication through some eyepiece application to external systems and devices, such as a reconciliation system for tracking communications and associated actions. In embodiments, the eyepiece and/or a secure access system may require identity verification, such as through biometric authentication, e.g. fingerprint capture, iris capture and identification, and the like. For example, the soldier may receive a secure communication that is a security alert, where the secure communication is accompanied by a secure link to further information, and where the soldier is required to provide biometric authentication before being granted access. Once authenticated, the soldier may be able to use gestures, lists, and the like in responding to and manipulating the content available through the eyepiece, such as manipulating links, data, images, and the like available directly from the communication and/or through included links. Providing the soldier the ability to respond to and manipulate content associated with the secure communication may better allow the soldier to interact with the message and content in a way that does not compromise them in whatever insecure environment they may currently be in. In embodiments, other events and/or data feeds as described herein, user-action capture inputs and/or devices, user movements or actions for control or command initiation, command and/or control modes and interfaces into which inputs may be reflected, applications usable on the platform to respond to inputs, communications or connections from the platform interface to external systems and devices, applications for external devices, and the like may also be applied.
In one example, control aspects of the eyepiece may include the following combination: using an inertial user interface as the user-action capture input device to supply military instruction, otherwise supplied to a soldier through the eyepiece, to an external display device. For example, a soldier wearing the eyepiece may wish to provide instruction from their briefing, as available through the eyepiece's facilities, to a group of other soldiers in the battlefield. The soldier may provide an interface for manipulating content in the briefing through the use of a physical 3D or 2D mouse (such as with inertial motion sensors, MEMS inertial sensors, ultrasonic 3D motion sensors, IR, ultrasonic or capacitive touch sensors, accelerometers, and the like), a virtual mouse, a virtual touch screen, a virtual keyboard, and the like. The briefing may be viewed and manipulated through the eyepiece, but also exported in real time, such as through an external router connected to an external display device (such as a computer monitor, projector, video screen, TV screen, and the like). The eyepiece may thereby provide the soldier a way to let others watch what is being seen through the eyepiece and controlled through the eyepiece's control facilities, allowing the soldier to export multimedia content associated with a briefing opened through the eyepiece to other, non-eyepiece wearers. In one example, a mission briefing may be provided to a battlefield commander, where the commander may be able to give their troops a briefing with the eyepiece's multimedia and augmented reality resources as available through the eyepiece, providing the benefit of these visual resources as described herein. In embodiments, other sensing inputs and/or sensing devices as described herein, user-action capture inputs and/or devices, command and/or control modes and interfaces into which inputs may be reflected, communications or connections from the platform interface to external systems and devices, useful external devices to be controlled, feedback related to external devices and/or applications, and the like may also be applied.
In one example, control aspects of the eyepiece may include the following combination: a nod movement as the user action initiating a command, a graphical user interface as the mode and/or interface into which control inputs may be reflected, an entertainment application on the eyepiece that uses commands and/or responds to inputs to control audio, an audio system controller as what the eyepiece interfaces with to communicate and/or connect with external systems or devices, and the like. For example, the wearer of the eyepiece may control an audio player through the eyepiece and wish to change to the next track. In this example, the wearer's nod may be programmed to indicate a track change. In addition, the eyepiece may project to the wearer a graphical user interface for the audio player, such as showing which track is playing. The wearer's nod may then be interpreted by the eyepiece's processing facilities as a change-track command, the command may then be sent to the audio system controller to change tracks, and the graphical user interface for the audio player may then show the track change to the wearer.
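The nod-to-track-change flow above is, in essence, a dispatch from a recognized gesture to a command understood by the external controller. A minimal sketch of that dispatch follows; the gesture names, the `AudioSystemController` stand-in, and the track titles are all illustrative assumptions.

```python
# Map recognized gestures to commands for the external audio controller.
GESTURE_COMMANDS = {
    "nod": "next_track",
    "shake": "previous_track",
    "double_nod": "pause",
}

class AudioSystemController:
    """Stand-in for the external controller the eyepiece communicates with."""
    def __init__(self, tracks):
        self.tracks = tracks
        self.index = 0
        self.playing = True

    def handle(self, command):
        if command == "next_track":
            self.index = (self.index + 1) % len(self.tracks)
        elif command == "previous_track":
            self.index = (self.index - 1) % len(self.tracks)
        elif command == "pause":
            self.playing = False
        # Return the current track so the GUI can show the change.
        return self.tracks[self.index]

def on_gesture(gesture, controller):
    command = GESTURE_COMMANDS.get(gesture)
    return controller.handle(command) if command else None

player = AudioSystemController(["march", "anthem", "reveille"])
print(on_gesture("nod", player))    # 'anthem' (advanced one track)
print(on_gesture("nod", player))    # 'reveille'
print(on_gesture("blink", player))  # None (unmapped gesture is ignored)
```

Returning the current track to the caller mirrors the closed loop in the paragraph above, where the projected GUI reflects the change back to the wearer.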
In one embodiment, control aspects of the eyepiece may include the following combination: a motion sensor as the sensing input, an augmented reality interface as the command and control interface into which inputs may be reflected to the wearer, a rangefinder as the external device to be controlled and from which information is to be collected, and the like. For example, the wearer of the eyepiece may be monitoring motion in a certain environment with the motion sensor, and when the motion sensor is triggered, the augmented reality interface may be projected to the wearer to help identify the object. In addition, other sensors may help in identification, such as the rangefinder used to determine the distance to the object. The augmented reality interface may provide the wearer information about the object, such as the object's position on a projected 2D or 3D map, the object's identity from previously collected information (such as stored in an object database, including facial recognition and object recognition), the object's coordinates, night-vision imaging of the object, and the like. The triggering of the motion detector may be interpreted by the eyepiece's processing facilities as an alert event; the command may then be transmitted to the rangefinder to determine the object's position, and delivered to the eyepiece's earphone speakers to provide the wearer an audio alert that a moving object has been sensed. The audio alert, added to the visual indicator, may act as input to the wearer that attention should be paid to the moving object, such as when the object has been identified as one of interest to the wearer.
In one example, control aspects of the eyepiece may include the following combination: a wearable sensor set as the user-action capture input, a robotic control interface as the command and control interface into which inputs may be reflected, a drone or other robotic device as the external device to be controlled, and the like. For example, the wearer of the eyepiece may be provided with a sensor set for control inputs to a drone, such as motion sensors for controlling the drone's movement, hand-recognition control for manipulating the drone's control features (such as through a graphical user interface displayed through the eyepiece), voice command input for controlling the drone, and the like. The eyepiece may have a robotic control interface for managing and reconciling the various control inputs from the sensor set, and for providing the interface for controlling the drone. The drone may then be remotely controlled through the wearer's movements, such as through a control center for drone command and management, through a more direct wireless connection to the drone, and so on. In another similar example, a robot (such as a bomb-disposal robot) may be controlled through the sensor set and the eyepiece's robotic control interface. For example, a graphical user interface may be provided to the wearer giving a 2D or 3D view of the environment around the robot, where the sensor set provides translation of the wearer's movements (arms, hands, etc.) into movements of the robot. In this way, the wearer may be able to provide a remote control interface to the robot.
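The translation of wearer limb movement into robot movement described above typically involves scaling and clamping the sensed displacement so that a large human motion maps to a small, bounded robot command. A minimal sketch under those assumptions (the gain and limit values, and the `scale_motion` name, are invented for illustration):

```python
def scale_motion(wearer_delta, gain=0.25, limit=0.1):
    """Translate a wearer limb displacement (metres, as an (x, y, z) tuple)
    into a robot arm command, scaled down and clamped for safe teleoperation."""
    x, y, z = (c * gain for c in wearer_delta)
    clamp = lambda v: max(-limit, min(limit, v))
    return (clamp(x), clamp(y), clamp(z))

# A 0.8 m hand motion is scaled to 0.2 m, then clamped to the 0.1 m limit.
print(scale_motion((0.2, -0.1, 0.8)))  # (0.05, -0.025, 0.1)
```

Clamping at the robot side of the mapping is a common teleoperation safeguard: an abrupt or spurious sensor reading cannot command a motion larger than the configured limit.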
In one example, control aspects of the eyepiece may include the following combination: entering a certain location as the event to the eyepiece, a prediction-learning-based user interface as the command and control mode and/or interface into which the event-occurrence input may be reflected, an entertainment system as the external device to be controlled, and the like. For example, the eyepiece may be programmed to learn the wearer's behavior, such as what the wearer usually does upon entering a room with an entertainment system, e.g. whether the wearer turns on the television, audio system, gaming system, and the like. From the learned behavior, the eyepiece may be able to make predictions about what the wearer wants in terms of eyepiece control functions. For example, the wearer comes into the living room, the eyepiece senses the location and that the wearer would normally turn on music through the entertainment system when entering the room, and commands the entertainment system to play the music last played. In embodiments, the eyepiece may sense the location through a variety of methods and systems, such as through a vision system, an RFID system, a GPS system, and the like that identifies the location. Commanding the entertainment system may be done by providing the wearer a graphical user interface for selection, providing the wearer an audio-voice command system interface for selection and speech-recognized commands, an automatic activation of commands, and the like. There may be profiles associated with these learned commands, where the wearer may modify the learned profile and/or set preferences within the learned profile to help optimize the automated actions, and the like.
In one example, control aspects of the eyepiece may include the following combination: a personal event as the event and/or data feed, a speech recognition system as the user-action capture input device, an audible command interface as the command and control interface into which inputs may be reflected, video conferencing as the application on the eyepiece used to respond to inputs from the wearer, and so on. For example, the wearer may get a projected visual indication of a calendar event for a certain conference call. The user may then use voice commands into the audible command interface on the eyepiece to call up the dial-in information for the call, and initiate the video conference by voice command. In this way, the eyepiece may act as a personal assistant, calling up calendar events and providing the wearer a hands-free command interface for executing them. In addition, the eyepiece may provide the visual interface for the video conference, where the images of the other participants are projected to the wearer through the eyepiece, and an external camera provides video images of the wearer through a communications connection with the eyepiece. The eyepiece may provide a fully integrated personal assistant and phone/video conferencing platform, subsuming the functions of other traditionally separate electronic devices, such as a mobile phone, a PDA, a calendar, hands-free command and control interfaces, and the like.
In one example, control aspects of the eyepiece may include the following combination: a security event as the event and/or data feed; a camera and touch screen as user-action capture input devices; information processing, fingerprint capture, and facial recognition applications on the eyepiece for responding to inputs; a graphical user interface for communications and/or connection between the eyepiece and external systems and devices; external information processing, fingerprint capture, and facial recognition applications and databases for accessing external security devices and connectivity; and the like. For example, a security officer may be handling a "security event", which may be a certain checkpoint where many people are to be security checked and/or identified, a certain individual marked as needing to be checked and/or identified, and the like, where a need is identified to record the individual's biometric information (e.g. because they do not appear in a security database, because of suspicious behavior, etc.). The security officer may then use biometric input devices, such as a camera for photographing faces and a touch screen for recording fingerprints, where the biometric inputs are managed through the internal information processing, fingerprint capture, and facial recognition applications on the eyepiece. In addition, the eyepiece may provide a graphical user interface as the communications connection to external information processing, fingerprint capture, and facial recognition applications, where the graphical user interface provides a data capture interface, external database access, a persons-of-interest database, and the like. The eyepiece may provide an end-to-end security management facility, including surveillance of persons of interest, input devices for acquiring biometric data, display of inputs and database information, connectivity to external security and database applications, and the like.
In one example, control aspects of the eyepiece may include the following combination: finger movement as the user action initiating an eyepiece command; a clickable icon as the command-and-control mode and/or interface into which the user action can be reflected; an application on the eyepiece (such as a phone application, music search, advertisement selection); an advertisement-tracking API as the communication and/or connection from the eyepiece application to an external system; an external advertising application; feedback to the user; and so on. For example, a facility for monitoring the user's selection of applications on the eyepiece may be implemented through an API, such that the monitoring provides a service to an advertisement-placement facility, feedback to the wearer about other applications that may interest the wearer based on the monitored behavior, and the like. During a session, the wearer may select a certain application to use and/or download, such as by being presented a graphical user interface of clickable icons, where the wearer may be able to select an icon through a finger-movement control facility (such as a camera or inertial system, through which the wearer's finger movement serves as a control input, in this case selecting a clickable icon). The selection may then be monitored through the advertisement-tracking API, which sends the selection, or a stored plurality of selections (such as selections stored over a period of time), to the external advertising application. The wearer's application selections (in this case "virtual clicks") may then be analyzed in order to generate advertising revenue, such as by placing advertisements back to the wearer, selling the data to third-party advertising facilities, and the like. Further, the external advertising application may use the analysis to determine the wearer's preferences in application usage, and send feedback to the wearer in the form of recommendations of applications that may interest the wearer, a preference profile, lists of downloads that other similar users found interesting, and so forth. In embodiments, while helping the external advertising application generate advertising revenue for third parties, the eyepiece may provide services that improve the wearer's experience with the eyepiece, such as with recommendations of downloads that may interest the wearer, with better-targeted advertisements more likely to interest the wearer, and so on.
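The advertisement-tracking API described above (storing "virtual clicks" on the eyepiece and forwarding them to an external advertising application) might be sketched as follows. The class name, method names, and flush policy are hypothetical illustrations, not part of the disclosure:

```python
from collections import Counter

class AdTrackingAPI:
    """Hypothetical sketch: icon selections ('virtual clicks') are stored
    locally and periodically flushed to an external advertising application."""

    def __init__(self, flush_after=3):
        self.pending = []            # selections not yet sent
        self.flush_after = flush_after

    def record_selection(self, app_name):
        """Called when a finger movement selects a clickable icon."""
        self.pending.append(app_name)
        if len(self.pending) >= self.flush_after:
            return self.flush()
        return None

    def flush(self):
        """Summarize stored selections as a simple preference profile,
        standing in for the upload to the external application."""
        profile = Counter(self.pending)
        self.pending = []
        return profile

tracker = AdTrackingAPI(flush_after=3)
tracker.record_selection("music_search")
tracker.record_selection("phone")
profile = tracker.record_selection("music_search")
# repeated selections of music apps suggest a preference for music
assert profile["music_search"] == 2
```

A real implementation would transmit the batched selections over the eyepiece's network connection rather than returning them locally.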
In one example, control aspects of the eyepiece may include the following combination: body-movement sensors (such as a motion sensor for sensing head movement, a camera for sensing hand movement, a motion sensor for sensing body movement) and touch or audio sensors as user-motion capture sensing devices (such as sensing a game device, e.g., a steering wheel, a sword, or the like; sensing another player in the game; and so on); head and hand movement as the user action for controlling and/or initiating commands (such as through gesture control); a virtual-reality interface as the input-and-control interface into which commands can be reflected; an information display on the eyepiece as the application responding to inputs; a computer gaming device as the external device to be controlled by the game application; and game content played back to the wearer, along with performance, ratings, scores, and the like, as the feedback relating the user, the external device, and the application. For example, the wearer may be able to play an interactive computer game with the eyepiece (such as on a computer, on a computer-based gaming platform, on a mobile gaming platform), where the wearer's body movements are interpreted as control inputs, such as through body-movement sensors, touch sensors, infrared sensors, IR cameras, visible-light cameras, and the like. In this way, the movement of the wearer's body can be fed into the computer game rather than using more traditional control inputs, such as a handheld game controller. For example, the eyepiece may sense the user's hand motions through the IR and/or visible-light cameras on the eyepiece and process them through onboard or external gesture-recognition algorithms; the eyepiece may sense the user's head motions through motion sensors on the eyepiece (such as sensing the user jumping, swaying back and forth, or moving from side to side in response to the game); and so on. In one embodiment, gestures may be captured by a downward-facing camera, or by a camera imaged downward such as by using fold optics. In other embodiments, the camera may capture gestures out of the direct line of sight. For example, a foviated or segmented camera may perform motion tracking and room mapping while looking forward, but have a downward-looking quadrant or hemisphere to capture gestures and user-interface control commands, where the user's hand rests at their side or on a thigh, not parallel with the central axis of the display. Of course, gestures may also be tracked as described, such as by IMU sensors, magnetic markers, RF tags, and the like used in a device (such as a ring controller, a watch, a pistol grip). These body-movement control inputs may then be fed into the virtual-reality interface and information-display application on the eyepiece to provide the user a visual depiction of the gaming environment; fed into a computer gaming platform for motion control of the gaming platform according to the user; or fed through the eyepiece to both the virtual-reality interface and the information display of the gaming platform to create an augmented-reality gaming platform; and so on. In embodiments, a control device or interactive control element used to sense body movement or otherwise indicate user interaction may be removed from the computer image by a processor associated with the eyepiece. Where the sensor is not intended to become part of the game, all or part of the image of the control device may be removed from the image the game generates for play. For example, where the sensor is applied only to detect hand/limb movement, the sensor may be removed from image generation; however, where the sensor or control device is an object relevant to playing the game (such as a sword), the object itself may be depicted in the played game or AR environment. In embodiments, it may be desirable to view the control device in a position other than where it actually is. For example, a target at which the user throws darts may be displayed at the end of a hallway in front of the user, rather than displayed in association with the darts the user throws. As a further example, if the user never actually releases the dart while playing the game, the dart serving as the control device may be shown traveling to the target based on the characteristics of the user's throw. In embodiments, a computer game may operate entirely onboard the eyepiece as a local game application; interface with an external gaming device local to the wearer; interface with networked gaming devices (such as a massively multiplayer online game, MMOG); operate as a combination of onboard the eyepiece and through a gaming platform; and the like. Where the eyepiece interfaces with and controls a local external gaming device (such as a gaming platform in the wearer's home), the eyepiece-executed portion of the game may provide the visual environment and information display to the wearer, while the external gaming device provides the execution of the game application. Alternatively, the eyepiece may provide the user motion-sensing interface and supply this information to the gaming platform, where the gaming platform then provides the visual interface of the game to the user. Alternatively, the eyepiece may provide the user motion-sensing interface, where the information is used by both the eyepiece and the gaming platform to create an augmented-reality interface combining the in-game visual interface presented to the user with the gaming platform. In embodiments, an AR application may augment an advertisement or the like on the side of a building or other structure, with objects passing in the foreground. As the user drives past, the camera may note that foreground objects (such as a lamp post along the curb) move through the field of view at a faster rate than the augmented surface in the background. The display system may subtract a portion of the augmented image to preserve the virtual layering of content behind the image. This may require proper calibration of the parallax among the user's glasses, display, and camera. In embodiments, this process may be used to generate a depth map. It will be clear to those skilled in the art that many different divisions between the processing provided by the eyepiece and the processing provided by an external device may be implemented. In addition, the game implementation may extend across the internet to external gaming devices, such as with an MMOG. The external device (whether local or across the internet) may then provide feedback to the wearer, such as providing at least part of the played content (e.g., a locally provided game projection combined with content from the external device and other players), performance indications, scores, ratings, and so on. In embodiments, the eyepiece may provide the user environment for computer gaming, where the eyepiece, together with external control inputs and external processing devices, interfaces to create a next-generation gaming platform.
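The motion-parallax relation described above, in which nearer objects such as a lamp post sweep through the field of view faster than the distant augmented surface, can be sketched with the standard small-angle approximation. The vehicle speed, angular rates, and helper name below are illustrative assumptions:

```python
def parallax_depth(translation_speed_mps, angular_speed_rad_s):
    # For sideways translation at speed v, a point at distance z sweeps
    # through the field of view at angular rate w ~= v / z, so z ~= v / w.
    return translation_speed_mps / angular_speed_rad_s

v = 10.0                              # assumed vehicle speed, m/s
lamp_post = parallax_depth(v, 2.0)    # fast apparent motion: foreground
billboard = parallax_depth(v, 0.1)    # slow apparent motion: far surface
assert lamp_post < billboard          # faster apparent motion => nearer
```

Repeating this estimate per tracked feature is one way a depth map of the scene could be accumulated, as the paragraph above suggests.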
As an alternative (or supplement) to the eyepiece detecting body movement through sensors in direct physical contact with the wearer (such as motion sensors, body-movement sensors, touch sensors), the eyepiece may incorporate an active remote sensing system for indirectly sensing and interpreting the wearer's body movement, such as a 3D active depth sensor that uses projected IR, sonar, RF, energy, or the like to sense the position of the wearer's hands, feet, and so on. The active 3D depth sensor may also be used in combination with a visible-light or IR camera on the eyepiece. The combination of camera and 3D depth sensor can provide 3D motion capture, which is processed on the eyepiece to provide advanced gesture recognition. The 3D active depth sensor may include a source (such as an IR laser projector) and a receiving sensor (such as an IR sensor). In embodiments, the camera and 3D active depth sensor may be pointed downward relative to the eyepiece line of sight, pointed to one side, pointed outward, and the like, to improve the system's visibility of the user's hands, feet, and so on. In embodiments, there may be multiple cameras on the eyepiece, such as one or more cameras for imaging as described herein (e.g., one facing forward, one detecting eye motion, one facing rearward) and one or more cameras for sensing the wearer's movements to command and control eyepiece functions, applications, external devices, and the like. In one example, a combination of a depth sensor and a camera may be pointed to capture images and movement of the wearer's hand, where the eyepiece processor uses the inputs from the depth sensor and the camera to track hand movement (such as translation and rotation of the hand, movement of individual fingers), computes the motion of the hand using a motion algorithm, and controls eyepiece functions based on the detected motion and according to an onboard database of command functions for detected motions. In embodiments, the interpreted hand movements may be used to control eyepiece functions or eyepiece applications, to control external devices through the eyepiece, as input to an external gaming platform, as input to an internal virtual-reality game, and so on. In embodiments, the camera, the 3D active depth sensor, and the associated algorithms may be combined with an onboard microphone or microphone array to detect sounds and movement in the surrounding environment.
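The onboard database of command functions mentioned above can be illustrated with a minimal lookup-and-dispatch sketch. The motion names and command strings are hypothetical placeholders, not taken from the disclosure:

```python
# Minimal sketch: a detected hand motion (from the fused depth-sensor +
# camera tracker) is looked up in an onboard command database and mapped
# to an eyepiece function, external device, or game input.
COMMAND_DB = {
    ("swipe", "left"): "previous_screen",
    ("swipe", "right"): "next_screen",
    ("pinch", "in"): "zoom_out",
    ("pinch", "out"): "zoom_in",
}

def dispatch(motion, direction):
    """Return the eyepiece command for a detected motion, if any."""
    return COMMAND_DB.get((motion, direction), "ignore")

assert dispatch("swipe", "right") == "next_screen"
assert dispatch("wave", "up") == "ignore"   # unknown motions are ignored
```

In practice the classifier feeding `dispatch` would come from the motion algorithm described above; the table could equally map motions to external-device or game commands.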
The disclosure may include an interactive head-worn eyepiece worn by a user, where the eyepiece includes an optical assembly that provides the user a forward-looking see-through view of the surrounding environment together with a view of displayed content introduced to the optical assembly from an integrated image source, and where the eyepiece provides an integrated camera with a downward-looking see-through view of the surrounding environment for user gesture recognition by an integrated gesture-recognition facility. In embodiments, the gesture-recognition facility may interpret motions identified by the eyepiece as commands to the eyepiece. The motion may be a hand motion, an arm motion, a finger motion, a foot motion, a leg motion, and the like. The integrated camera may be able to view the surrounding environment along both a forward-looking line of sight and the downward-looking line of sight used for gesture recognition. The integrated camera may have a segmented optical element for simultaneously imaging the forward-looking view and the downward-looking view. In addition, the eyepiece may have an active sensing system for indirectly sensing and interpreting the user's body movement, where the active sensing system provides an active signal along the downward-looking view through the optical assembly. The active signal may be IR, sonar, RF, or the like. The active sensing system may include a 3D active depth sensor for sensing the position of at least one of the user's hands, feet, body, and so on. The active sensing system may be used together with the integrated camera to further provide user gesture recognition.
In embodiments, the eyepiece may include a dual mode for marking location and tracking. A GPS mark of a location may be created in the general vicinity of a POI, and then another, second mark may be created. The second mark may be generated by sensor readings, image processing, image recognition, user feedback, and the like. The second mark may be used for tracking between acquisitions of GPS readings. In embodiments, the second mark may be used to provide the distance to or from a point of interest. The dual-marking system may provide the user the distance, time, and direction between the two points. The points may be points of interest for travel, transportation, commerce, business, and the like. This dual mode for marking location and tracking may allow the user to locate items for purchase, projects, travel destinations, modes of transportation to get to, and so on. Transportation items may include the user's car, trains, airplanes, taxi stands, taxis, subways, and the like. Commerce items may include various items such as, but not limited to, food, entertainment, shopping, clothing, books, services, and the like. In embodiments, the item to be located may be a tourist attraction, a restaurant, a park, a street, and so on. There may be an ecosystem of marks, from QR codes to a broad range of communication devices (routers and switches) or passive sensors (RFID tags that can be read), all of which may wish to forward certain relevant information to the glasses, whether by delivering the exact position of certain content or by allowing a back-end network to estimate content specific to the position of the mark itself. A single party may use two marks to help orient or triangulate the location of the glasses, thereby providing exact orientation and range information in an easier way than with certain single marks (especially those that are not visual). In embodiments, two marks may be processed. The eyepiece may be able to identify two marks nearby or in the field of view and act on them, either simultaneously (such as for triangulation) or by assigning priority to one of them (for example, in an advertising scenario a paid mark may take priority over an unpaid one; a safety-oriented mark may take priority over an advertising mark; and so on). Marks may originate from the glasses, but may also be placed by, for example, other glasses or other systems (such as systems of advertisers, government parties, etc.).
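The distance and direction between a GPS mark and a second mark, described above, can be sketched with standard great-circle formulas. The coordinates and the assumed walking speed below are illustrative only:

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Haversine distance (metres) and initial bearing (degrees)
    from one mark to another."""
    R = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    d = 2 * R * math.asin(math.sqrt(a))
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    bearing = math.degrees(math.atan2(y, x)) % 360
    return d, bearing

# A GPS mark near a POI vs. a second, refined mark a short walk away
d, b = distance_and_bearing(40.7580, -73.9855, 40.7614, -73.9776)
walking_time_min = d / (1.4 * 60)  # assuming ~1.4 m/s walking speed
assert 700 < d < 900               # roughly 0.8 km between the marks
```

The same arithmetic would give the "distance, time, and direction between the two points" that the dual-marking system presents to the user.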
In embodiments, a system may include an interactive head-worn eyepiece worn by a user, where the eyepiece includes an optical assembly through which the user views the surrounding environment and displayed content, an integrated image source for introducing the content to the optical assembly, and an integrated processor that reads a GPS-based mark producing a point of interest and stores the mark in a memory of the eyepiece, where the integrated processor creates a second mark related to the GPS point and stores the second mark in the memory. In embodiments, the second mark may be generated by at least one of sensor readings, image processing, image recognition, user feedback on the current location, and the like. The second mark may be used to calculate at least one of the distance, the direction, and the time to the point of interest. In embodiments, the point of interest may be at least one of a tourist attraction, a restaurant, a park, a street, and the like. In embodiments, the GPS point may be used together with the second mark to provide at least one of the distance, direction, and time to a certain commerce item, and so on. In embodiments, the GPS point may be used together with the second mark to provide at least one of the distance, direction, and time to a certain mode of transportation, tracking of a certain mode of transportation, and the like. In these embodiments, the mode of transportation may include at least one of a train, a subway, a car, and the like. In embodiments of the system, the GPS point may be used together with the second mark to provide tracking of the point of interest. In embodiments, the GPS point may be used together with the second mark to provide tracking of a certain commerce item. In embodiments, the user feedback may be verbal input to the eyepiece. The second mark may be generated by various means. For example, the second mark may be based on processing of still and/or video images captured by the eyepiece to obtain location information about the subject of the image. In embodiments, the second mark may be based on data obtained from an internet search, scanning a QR code, a barcode, an object, and the like.
In various embodiments, the eyepiece may include earphones for providing augmented hearing, where the user can hear his surroundings together with additional audio. This audio may include game content, sports commentary, and the like. In embodiments, the microphones and/or earbuds may play audio binaurally or otherwise. In embodiments, bone-conduction earphones may be used with the eyepiece. These earphones allow the user to receive audio waves transmitted to the inner ear through the skull (thereby bypassing the user's eardrums). In embodiments, the earphones may be used against the cheekbone just in front of the user's ear, or against other bones that can transmit audio. The user may thus be allowed to monitor or hear audio while remaining aware of his or her surroundings. In embodiments, the earphones may also use an audio laser, whereby the earphones emit sound waves through the use of a laser. In embodiments, the earphones may also use devices that allow the user to experience increased volume and/or clarity of external sounds or of sounds generated by the earphones. In various embodiments, the earphones of the eyepiece may play and/or transmit audio from radio, audio obtained wirelessly, audio obtained over the internet, and the like. In embodiments, the eyepiece may also deliver satellite-broadcast audio to the user.
In various embodiments, the eyepiece may include RF shielding for the brain or other parts of the user's body. In various embodiments, any portion of the eyepiece that emits an electromagnetic field may be shielded by a barrier made of conductive or magnetic material or other materials. In embodiments, the barrier may include sheet metal, metal mesh, metal foam, foamed materials, and the like. The holes in the shield or mesh are much smaller than the wavelength of the radiation being blocked, or of other radiation. In embodiments, the inside or other portions of the eyepiece and/or the eyepiece housing may be coated with metallic ink or other materials to provide shielding. Such metal may be copper, nickel, or the like in very small particle form. Such metal may be sprayed onto the housing. In further embodiments, this RF shielding may be worn by the user at other locations to prevent various frequencies from reaching his or her brain, eyes, or body.
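As a rough illustration of the "holes much smaller than the wavelength" criterion above, the sketch below compares a hypothetical mesh aperture with the free-space wavelength of a 2.4 GHz radio; the specific frequency and aperture size are assumptions for illustration, not values from the disclosure:

```python
def free_space_wavelength_m(freq_hz):
    c = 299_792_458.0  # speed of light, m/s
    return c / freq_hz

# A 2.4 GHz link (typical of short-range wireless such an eyepiece might
# carry) has a wavelength of about 12.5 cm, so a mesh shield with, say,
# 1 mm apertures is far smaller than the wavelength it must block.
wavelength = free_space_wavelength_m(2.4e9)
aperture = 1e-3  # metres (assumed mesh hole size)
assert aperture < wavelength / 100  # "much smaller" criterion
```

Higher emission frequencies shorten the wavelength proportionally, which is why the aperture size must be chosen against the highest frequency to be blocked.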
In embodiments, an interactive head-worn eyepiece worn by a user may include an optical assembly through which the user views the surrounding environment and displayed content, an integrated image source for introducing the content to the optical assembly, a radio component, and shielding, where the radio component emits electromagnetic radiation and the shielding blocks a portion of the radiation from exiting the eyepiece. In further embodiments, the shielding may be positioned to protect the user from the radiation. In addition, the shielding may be positioned to protect both the user and others from exposure to the radiation. In embodiments, the shielding may shield at least the user's brain, a certain part of the user's head, another part of the user's body, and the like. In embodiments, the shielding may be composed of at least one of conductive material, magnetic material, sheet metal, metal mesh, a grid, and metal foam. In the embodiments described herein, the shielding may include holes smaller than the wavelength of a particular radiation, where the holes are smaller than the wavelength of the radiation emitted from the eyepiece. In embodiments, the shielding may include at least one of metallic ink, copper ink, nickel ink, and the like. In embodiments, the shielding may coat the inside of the eyepiece. In embodiments, the shielding may be disposed on at least one of the temples of the eyepiece, the forehead portion of the eyepiece, and the like. In embodiments, the shielding may be worn by the user.
In one example, control aspects of the eyepiece include the following combination: sensors such as IR, heat, force, ozone, carbon monoxide, and the like as inputs; a microphone as an additional input device; voice commands as the actions made by the wearer to issue commands; a head-mounted display as the command-and-control interface into which inputs can be reflected; an instruction-guidance application to provide guidance while reducing the need to use the hands, such as in maintenance and assembly; and a visual display providing feedback to the wearer according to the wearer's actions and the sensor inputs; and so on. For example, a motor vehicle technician may be a wearer of the eyepiece, where the technician is using the eyepiece to assist in the maintenance of a vehicle. An instruction-guidance application, such as run by the eyepiece, may provide the technician hands-free instructions and access to computer-based expertise when diagnosing problems with the vehicle. In addition, the application may provide a guide to procedures unknown to the technician. The eyepiece may also monitor and diagnose various safety-related sensor inputs, such as from IR, heat, force, ozone, and carbon monoxide sensors, such that the sensor inputs can be accessed by the instruction application and/or directly by the technician. The application may also provide a microphone through which voice commands can be received; a display for indicating information, such as a head-mounted display with 2D or 3D depictions of the vehicle and of the parts under repair; timely feedback on the repair; and the like. In embodiments, the eyepiece may provide the technician hands-free virtual assistance to support the diagnosis and repair of vehicles.
In one example, control aspects of the eyepiece may include the following combination: the eyepiece enters an "activity state" (such as a "shopping" activity mode), for example when the user commands the eyepiece to enter a shopping mode or the eyepiece senses that it is adjacent to a shopping area, perhaps a shopping area of interest to the wearer as obtained through a preference profile, which may be further refined in part through self-monitoring and learning of the wearer's shopping preferences. Continuing this example, entering an activity state (such as a shopping activity state) while driving may be combined with the following: an object detector as the sensed input or sensing device; a head-mounted camera and/or gaze-detection system as the user-action capture input; eye movement as the user movement or action for controlling or initiating commands; a 3D navigation interface as the command-and-control mode and/or interface into which inputs can be reflected; an e-commerce application onboard the eyepiece as the application coordinating command inputs and the user interface; navigation-system control as the communication or connection with external systems and devices; a vehicle navigation system as the external device to be controlled and/or interfaced with; an advertising facility as the external application for processing user actions against an advertising database; and a bullseye or target-tracking facility as feedback to the wearer about shopping opportunities in view while driving. For example, the wearer may enter a shopping area while driving their vehicle, and upon detecting the presence of the shopping area (by GPS, by directly viewing a target via the integrated camera, etc.), the eyepiece may enter the "shopping activity state" (such as enabled and/or approved by the wearer). The object detector may then be available to detect shopping opportunities, such as locating billboards, storefronts, and the like through the eyepiece's head-mounted camera. In addition, the gaze-detection system on the eyepiece may monitor where the wearer is looking and post information that may be salient about the target the wearer is gazing at, such as sales or special events currently offered in a store. The wearer's eye movements may also be tracked, such as for changing the target of interest or for command input (for example, a quick nod indicating a select command, a downward eye movement indicating a command for additional information, and the like). For example, the user's iris or retina may be tracked to provide control inputs. The eyepiece may invoke projection of the 3D navigation interface to assist in providing the wearer information about their surroundings, and the e-commerce application to coordinate the shopping activity state, such as taking inputs from the wearer, providing outputs to the 3D navigation interface, interfacing with external devices and applications, and so on. The eyepiece may interface with the vehicle navigation system, such as using navigation-system control, and the vehicle navigation system may thereby be included in the shopping experience. Alternatively, the wearer may be able to use the eyepiece's own navigation system (such as in place of, or augmenting, the vehicle's system) when, for example, they step out of the vehicle and want directions provided to them. As part of the shopping activity state, the eyepiece may interface with external advertising facilities, such as to provide current sales, special events, or pop-up advertisements for surrounding merchants. The external advertising facility may also interface with third-party advertisers, publishers, merchant supply organizations, and the like, who may contribute to the information provided to the wearer. In embodiments, by entering an activity state, the wearer may be provided feedback associated with the activity state, such as, for the shopping activity state, feedback in the form of information associated with identified targets.
In one example, control aspects of the eyepiece may include the following combination: receipt of an email as the trigger event; inertial movement tracking as the user-action capture input device; drag-and-drop and swipe movements made with a finger as the user movement or action for controlling or initiating commands; a navigable list as the command-and-control interface into which inputs can be reflected; bill payment as the type of application usable on the eyepiece that responds to inputs; information transfer from the eyepiece interface as the communication or connection to external systems and devices; a payment system as the external system and device application; iris capture and recognition as an identification system; and so on. For example, the wearer may receive a bill via email, and the email may enter the eyepiece as an "event" for the wearer, such as using a visual and/or audible alert to trigger an operating mode of the eyepiece, start an application on the eyepiece, and the like. The wearer may react to the email event through a plurality of control mechanisms, such as the wearer using finger and hand "drag-and-drop", swiping, and the like through a hand-gesture interface (for example, through an onboard camera on the eyepiece and a hand-gesture application, where the wearer drags the email, or information within it, into a folder, an application, another email, etc.). The wearer may invoke a navigable list of bills to pay. The user may transfer information from the email (for example, bill information, account number, payment amount, etc.) via the eyepiece application to external systems and devices, such as a payment system for paying the bill. In embodiments, the eyepiece and/or the payment system may require identity verification, such as through biometric verification, e.g., fingerprint capture, iris capture and recognition, and the like.
In one example, control aspects of the eyepiece may include the combination of using an inertial user-interface device as the user-action capture device for providing instruction through the eyepiece to an external display device. For example, the wearer may wish to provide instruction to a group of individuals from a presentation viewable through the eyepiece. The wearer may be able to manipulate content in the presentation through an interface assisted by a physical 3D or 2D mouse (for example, one with an inertial motion sensor, a MEMS inertial sensor, an ultrasonic 3D motion sensor, an accelerometer, etc.), a virtual mouse, a virtual touch screen, a virtual keyboard, and the like. The presentation may be viewed through the eyepiece and manipulable through the eyepiece, but may also be exported in real time, such as to an external router connected to an external display device (for example, a computer monitor, a projector, a display screen, a video screen, etc.). The eyepiece may thus provide the wearer a way for others to view what the wearer sees through the eyepiece and to control it through the eyepiece's control facilities, thereby allowing the wearer to export a multimedia event enabled through the eyepiece to other, non-eyepiece wearers.
In one example, control aspects of the eyepiece may include the combination of using an event/data feed and a sensing input/sensing device, such as where an additional acoustic sensor for security events may be implemented. There may be a security alert sent to a soldier, with acoustic sensors as input devices to monitor voice content in the surrounding environment, the direction of gunfire, and the like. For example, a security alert is broadcast to all military personnel in a particular area, and through the alert the eyepiece activates an application that monitors an embedded acoustic-sensor array, which analyzes sounds to identify the type of sound source and the direction from which the sound came. In embodiments, as described herein, other events and/or data feeds, sensing inputs and/or sensing devices, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of using an event/data feed and a user-action capture input/device, such as a request for input together with the use of a camera. A soldier may be located at a position of interest and be sent a request for photographs or video from their location, such as where the request includes instructions for what to photograph. For example, a soldier is manning an inspection checkpoint, and at some central command it is determined that a person of interest may be attempting to pass through the checkpoint. Central command may then instruct eyepiece users near the checkpoint to record and upload images and video, which in embodiments may be performed automatically, without the soldier having to manually turn on the camera. In embodiments, as described herein, other events and/or data feeds, user-action capture inputs and/or devices, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of using an event/data feed and a user movement or action, such as when a soldier enters an "active state" and they use hand gestures for controlling or initiating commands. A soldier may enter an active state of readiness for engagement with an enemy, and in the command-and-control environment of the engagement the soldier may silently command the eyepiece using hand gestures. For example, a soldier may come upon a hostile area as determined from newly received information, which places the eyepiece in a heightened state of alert. In this case silence may be a requirement, and so the eyepiece transitions to a hand-gesture command mode. In embodiments, as described herein, other events and/or data feeds, user movements or actions for controlling or initiating commands, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of using an event/data feed and the command/control modes and interfaces into which inputs can be reflected, such as the user entering a certain type of environment and a virtual touch screen. A soldier may enter a weapons-system area, and a virtual touch screen may be used as part of the wearer's control of the weapons system. For example, a soldier enters a weapons vehicle; the eyepiece detects the presence of the weapons system and that the soldier is authorized to use the weapon, and calls up a virtual fire-control interface on the eyepiece's virtual touch screen. In embodiments, as described herein, other events and/or data feeds, command and/or control modes and interfaces into which inputs can be reflected, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of an event and/or data feed and an on-platform application that uses commands or responds to inputs, such as a security event for a pilot combined with simplified access to information. A squadron pilot (or whoever is responsible for the flight check of a UAV) may receive a security event notification as they approach the aircraft before takeoff, and an application may be brought up that walks them through the preflight check. For example, a drone specialist approaches a drone to prepare it for launch, and an interactive checklist procedure is displayed to the soldier through the eyepiece. In addition, a communication channel may be opened to the drone's operator so that they can be included in the preflight check. In embodiments, as described herein, other events and/or data feeds, on-platform applications that use commands and/or respond to inputs, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of an event and/or data feed and communication or connection from an interface on the platform to external systems and devices, such as a soldier arriving at a location and a graphical user interface (GUI). A soldier may enter a location where they are required to interact with external devices, where the external devices are interfaced through a GUI. For example, the soldier boards a military transport vehicle, and a GUI is presented to the soldier that unfolds an interface indicating the interactions they need to perform at different stages of the transport. In embodiments, as described herein, other events and/or data feeds, communications or connections from an interface on the platform to external systems and devices, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of an event and/or data feed and a useful external device to be controlled, such as a feed of instructions and a weapon system. A feed of instructions or directions may be provided to the soldier, where at least one instruction concerns control of an external weapon system. For example, the soldier is operating a piece of artillery, and the eyepiece not only provides information on performance and procedures associated with the weapon, but also a feed of instructions associated with aiming, corrections, and the like. In embodiments, as described herein, other events and/or data feeds, useful external devices to be controlled, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of an event and/or data feed and an application for a useful external device, such as a security event/feed and biometric capture/identification. A soldier may be notified through a transmitted security event (such as through a secure feed) to capture the biometrics of a particular individual (fingerprint, iris scan, gait profile), where the biometrics are stored, assessed, analyzed, and the like through an external biometrics application (such as served from a secure military network/cloud-based server). In embodiments, as described herein, other events and/or data feeds, external device applications, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of an event and/or data feed and feedback to the soldier related to external devices and applications, such as entering an active state and providing the soldier a display of information. A soldier may place the eyepiece into an active state, such as for military staging, preparation, action, debriefing, and the like, and as feedback for being placed into the active state, the soldier receives a display of information about the state entered. For example, the soldier enters the staging phase of a mission, where the eyepiece pulls from a remote server the information for the portion of the mission the soldier must complete during staging, including securing equipment, additional training, and the like. In embodiments, as described herein, other events and/or data feeds, feedback related to external devices and/or applications, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of a sensing input/sensing device and a user action capture input/device, such as utilizing an inertial motion sensor and a head tracking system. The soldier's head motion may be tracked through the eyepiece's inertial motion sensors, such as for nod control of the eyepiece, look-direction sensing by the eyepiece, and the like. For example, the soldier is aiming a weapon system, and the eyepiece senses the look direction of the soldier's head through the inertial motion sensors to provide continuous aiming of the weapon. In addition, the weapon system may move continuously in response to the soldier's look direction, and thus be continuously ready to fire on the target. In embodiments, as described herein, other sensing inputs and/or sensing devices, user action capture inputs and/or devices, and the like may also be applied.
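One illustrative way such head-slaved aiming could work is sketched below: each update from the head tracker gives a target yaw/pitch, and the mount steps toward it subject to a per-update slew-rate limit. The function name, angle units, and the 5-degree limit are assumptions for illustration, not part of the disclosure.

```python
def slew_command(current, target, max_step_deg=5.0):
    """Return the next (yaw, pitch) step toward the head's look direction,
    limited to the mount's per-update slew rate."""
    def step(cur, tgt):
        delta = tgt - cur
        return cur + max(-max_step_deg, min(max_step_deg, delta))
    return (step(current[0], target[0]), step(current[1], target[1]))

# Each head-tracker update nudges the mount toward the wearer's look direction:
mount = (0.0, 0.0)
gaze = (12.0, -3.0)   # yaw, pitch reported by head tracking, in degrees
while mount != gaze:
    mount = slew_command(mount, gaze)
```

Rate-limiting the slew keeps the mount from mirroring every small head tremor at full speed while still converging on the commanded direction.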
In one example, control aspects of the eyepiece may include the combination of a sensing input/sensing device and a user movement or action for controlling or initiating a command, such as utilizing an optical sensor and eye-close or blink movements. The state of the soldier's eye may be sensed through an optical sensor included in the optical train of the eyepiece, such as for controlling the eyepiece through eye movements. For example, the soldier may aim their rifle, where the rifle has the capability of being fired through a control command from the eyepiece (such as in the case of a sniper, where initiating the command through the eyepiece may reduce the aiming error caused by manually squeezing the trigger). The soldier may then issue the command to fire the weapon through the optical sensor detecting a predetermined eye movement (such as held in a command profile stored on the eyepiece). In embodiments, as described herein, other sensing inputs and/or sensing devices, user movements or actions for controlling or initiating commands, and the like may also be applied.
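A minimal sketch of the "predetermined eye movement held in a command profile" idea follows: timed eye closures from the optical sensor are classified as short or long and matched against a stored pattern-to-command table. The profile contents, threshold, and command names are invented for illustration; a deliberate pattern such as two long closures is unlikely to occur as natural blinking.

```python
COMMAND_PROFILE = {            # hypothetical stored command profile
    ("long", "long"): "FIRE",
    ("short", "long"): "SAFE",
}

def classify(closure_ms, long_threshold_ms=400):
    """Label one eye closure by its duration."""
    return "long" if closure_ms >= long_threshold_ms else "short"

def decode(closures_ms):
    """Translate a sequence of eye-closure durations into a command, or None."""
    pattern = tuple(classify(c) for c in closures_ms)
    return COMMAND_PROFILE.get(pattern)
```

Returning `None` for unmatched patterns is the safety-relevant choice here: ordinary blinking must never resolve to a command.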
In one example, control aspects of the eyepiece may include the combination of a sensing input/sensing device and a command/control mode and interface in which inputs are reflected, such as utilizing a proximity sensor and a robot control interface. A proximity sensor integrated into the eyepiece may be used to sense the soldier's proximity to a robot control interface in order to activate and enable use of the robot. For example, the soldier walks toward a bomb-detection robot, and the robot automatically activates and initializes a configuration specific to that soldier (e.g., configured to the soldier's preferences). In embodiments, as described herein, other sensing inputs and/or sensing devices, commands and/or control modes and interfaces in which inputs are reflected, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of a sensing input/sensing device and an on-platform application that uses commands or responds to inputs, such as utilizing an audio sensor and a music/sound application. The audio sensor may monitor ambient sound and start and/or adjust music volume, ambient sound levels, noise cancellation, and the like to help combat undesired ambient sound. For example, the soldier is loaded onto a transport whose engines are initially off. During this time the soldier may have no duties other than resting, so they turn on music to help them rest. When the transport's engines start, the music/sound application adjusts the volume and/or initiates additional noise-cancellation audio to help keep the music input at the level it was before the engines started. In embodiments, as described herein, other sensing inputs and/or sensing devices, on-platform applications that use commands and/or respond to inputs, and the like may also be applied.
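One simple way the sound application could track engine noise is sketched below, assuming RMS-based level estimation: the ambient level in dB is measured from audio-sensor samples, and the playback gain is kept a fixed margin above it, capped at a safe ceiling. The margin and cap values are illustrative assumptions.

```python
import math

def rms_db(samples):
    """Ambient level of a block of normalized samples, in dBFS."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(max(rms, 1e-9))

def playback_gain_db(ambient_db, margin_db=10.0, limit_db=30.0):
    """Keep playback margin_db above the ambient floor, capped at a safe limit."""
    return min(ambient_db + margin_db, limit_db)
```

As the engines spin up and `rms_db` rises, the returned gain rises with it, so the perceived music level relative to the cabin noise stays roughly constant until the hearing-safety cap is hit.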
In one example, control aspects of the eyepiece may include the combination of a sensing input/sensing device and communication or connection from an interface on the platform to external systems and devices, such as using a passive IR proximity sensor and an external digital signal processor. A soldier may use a passive IR proximity sensor to monitor a night scene; the sensor indicates motion, and the eyepiece initiates a connection to an external digital signal processor to help identify the target from the proximity sensor data. In addition, an IR imaging camera may be initiated to contribute additional data to the digital signal processor. In embodiments, as described herein, other sensing inputs and/or sensing devices, communications or connections from an interface on the platform to external systems and devices, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of a sensing input/sensing device and a useful external device to be controlled, such as using an acoustic sensor and a weapon system, where a loud sound (such as possibly an explosion or gunfire) is sensed through the eyepiece worn by the soldier, and where the eyepiece initiates control of the weapon system for possible action against a target associated with the source of the loud sound. For example, the soldier is on guard duty and hears gunfire. The eyepiece is able to detect the direction of the gunfire, and the soldier is directed to the position from which it came. In embodiments, as described herein, other sensing inputs and/or sensing devices, useful external devices to be controlled, and the like may also be applied.
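A hedged sketch of one way the direction of a gunshot could be estimated: with two acoustic sensors a known distance apart, the difference in arrival time of the report gives the angle of arrival relative to the broadside of the sensor pair. The spacing and speed-of-sound constant are illustrative assumptions, not values from the disclosure.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s, approximate in air at 20 degrees C

def bearing_deg(delay_s, mic_spacing_m):
    """Angle of arrival (degrees) from the time-difference-of-arrival at
    two microphones, relative to the broadside of the pair."""
    ratio = SPEED_OF_SOUND * delay_s / mic_spacing_m
    ratio = max(-1.0, min(1.0, ratio))   # clamp numerical overshoot
    return math.degrees(math.asin(ratio))
```

Zero delay means the sound arrived from straight ahead; a delay equal to the full spacing divided by the speed of sound means it arrived along the axis of the pair (90 degrees off broadside).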
In one example, control aspects of the eyepiece may include the combination of a sensing input/sensing device and an application for a useful external device, such as using a camera and external instructions. A camera embedded in the soldier's eyepiece may view an icon indicating that instructions are available, and the eyepiece accesses an external application to obtain the instructions. For example, the soldier is delivered to a staging area, and upon entry the eyepiece camera views an icon, accesses the instructions externally, and provides the soldier with directions for what to do, where all of the steps may be automatic, such that the instructions are provided without the soldier having to recognize the icon. In embodiments, as described herein, other sensing inputs and/or sensing devices, external device applications, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of a sensing input/sensing device and feedback to the user related to external devices and applications, such as using a GPS sensor and a visual display from a remote application. A soldier may have an embedded GPS sensor that sends/streams position coordinates to a remote location facility/application, which sends/streams back to the eyepiece a visual display of the surrounding physical environment for display. For example, the soldier may continuously view the surroundings through the eyepiece, and through continuous streaming from the embedded GPS sensor, the eyepiece allows the soldier to have an augmented reality visual display overlay of the surroundings even while the soldier is changing position. In embodiments, as described herein, other sensing inputs and/or sensing devices, feedback related to external devices and/or applications, and the like may also be applied.
In one example, control methods of the eyepiece may include the combination of a user action capture input/device and a user movement or action for controlling or initiating a command, such as utilizing a body-worn movement sensor (e.g., a motion sensor) and arm motion. A soldier may have body movement sensors attached to their arms, where the movements of their arms convey commands. For example, the soldier may have motion sensors on their arms, and the movements of their arms are replicated in an aircraft landing illumination system, so that the lights normally held by personnel assisting the landing can be made larger and more visible. In embodiments, as described herein, other user action capture inputs and/or devices, user movements or actions for controlling or initiating commands, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of a user action capture input/device and a command/control mode and interface in which inputs are reflected, such as a wearable sensor set and a user interface based on predictive learning. A soldier may wear a sensor set, where data from the sensor set are continuously collected and fed to a machine learning facility through a learning-based user interface, and where the soldier can accept, reject, modify, and the like what is learned from their actions and behaviors. For example, the soldier generally performs the same tasks in the same physical manner every Monday morning; the machine learning facility may build up a learned routine and present that routine to the soldier on the following Monday morning, such as with reminders to clean particular equipment, fill out a particular form, play particular music, meet with a particular person, and the like. In addition, the soldier may modify the learning results, such as in a learned behavioral profile, through direct editing of the routine. In embodiments, as described herein, other user action capture inputs and/or devices, commands and/or control modes and interfaces in which inputs are reflected, and the like may also be applied.
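A toy sketch of the predictive-learning idea, under the assumption of simple frequency counting rather than any particular machine-learning method: actions observed on a given weekday are suggested again on that weekday once they have occurred on at least half of the observed instances, and the wearer can reject suggestions, mirroring the accept/reject/modify interaction above. All class and method names are invented.

```python
from collections import Counter

class RoutineLearner:
    """Toy learner: suggests actions seen on most past instances of a weekday."""

    def __init__(self, min_fraction=0.5):
        self.min_fraction = min_fraction
        self.action_counts = Counter()   # (weekday, action) -> days it occurred
        self.day_counts = Counter()      # weekday -> days observed
        self.rejected = set()            # suggestions the wearer dismissed

    def log_day(self, weekday, actions):
        self.day_counts[weekday] += 1
        for action in set(actions):
            self.action_counts[(weekday, action)] += 1

    def reject(self, weekday, action):
        self.rejected.add((weekday, action))

    def suggest(self, weekday):
        days = self.day_counts[weekday]
        if not days:
            return []
        return sorted(
            action
            for (day, action), n in self.action_counts.items()
            if day == weekday
            and (day, action) not in self.rejected
            and n / days >= self.min_fraction
        )
```

Keeping the rejected set separate from the counts means a dismissed reminder stays dismissed even as the underlying behavior keeps recurring.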
In one example, control aspects of the eyepiece may include the combination of a user action capture input/device and an on-platform application that uses commands or responds to inputs, such as a finger-mounted camera and a video application. A soldier may control the direction in which video is shot through the eyepiece's native video application. For example, the soldier may be watching a combat scene where they must keep their gaze in one direction (such as to stay vigilant to new developments in the engagement) while shooting video in a different direction (such as the current point of engagement). In embodiments, as described herein, other user action capture inputs and/or devices, on-platform applications that use commands and/or respond to inputs, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of a user action capture input/device and communication or connection from an interface on the platform to external systems and devices, such as microphone and speech recognition input plus a steering wheel control interface. A soldier may be able to change aspects of the vehicle's handling via voice commands that are received by the eyepiece and delivered to the vehicle's steering wheel control interface (such as through wireless communication between the eyepiece and the steering wheel control interface). For example, the soldier is driving the vehicle on a highway, and accordingly the vehicle has particular handling capabilities ideal for highways. But the vehicle also has other modes for driving under different conditions, such as off-road, in snow, in mud, in heavy rain, when pursuing another vehicle, and the like. In such an instance, the soldier may be able to change modes by voice command as driving conditions change. In embodiments, as described herein, other user action capture inputs and/or devices, communications or connections from an interface on the platform to external systems and devices, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of a user action capture input/device and a useful external device to be controlled, such as microphone and speech recognition input plus a vehicle dashboard interface device. A soldier may use voice commands to control devices associated with the vehicle's dashboard, such as heating and ventilation, the radio, music, lights, the trip computer, and the like. For example, the soldier may be driving the vehicle on a mission through rough terrain, such that they cannot take either hand off the wheel to control dashboard devices manually. In such an instance, the soldier may control the dashboard devices through voice control of the eyepiece. Voice commands through the eyepiece may be particularly beneficial, such as relative to voice control through a dashboard microphone system, because a military vehicle can be immersed in a very loud acoustic environment, and so using the microphone in the eyepiece may provide substantially improved performance under these conditions. In embodiments, as described herein, other user action capture inputs and/or devices, useful external devices to be controlled, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of a user action capture input/device and an application for a useful external device, such as utilizing a joystick device and an external entertainment application. A soldier may access a game controller joystick and play games through an external entertainment application, such as a multiplayer game hosted on a network server. For example, the soldier may be experiencing downtime during a deployment, and at base they access a joystick device that interfaces with the eyepiece, where the eyepiece in turn interfaces with the external entertainment facility. In embodiments, the soldier may network together with other military personnel on the network. The soldier may have stored preferences, profiles, and the like associated with playing the game. The external entertainment application may manage their game play according to, for example, the soldier's deployment, current readiness state, required readiness state, past history, skill rating, command post position, rank, geographic location, future deployments, and the like. In embodiments, as described herein, other user action capture inputs and/or devices, external device applications, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of a user action capture input/device and feedback to the user related to external devices and applications, such as using an activity determination system and tone output or audible alarms. A soldier may access an activity determination system through the eyepiece to monitor and determine the soldier's activity state, such as during extreme activity, rest, boredom, stress, exercise, and the like, and where when conditions exceed limits in some way (such as preset, learned, or typical limits), the eyepiece may provide a form of tone output or audible alarm. For example, the soldier's current health state may be monitored during combat, and when the health state enters a danger level, the soldier and/or another individual (e.g., a medic, hospital personnel, another member of the soldier's team, the command center, and the like) is provided an audible signal, such as indicating that the soldier has been wounded in combat. As a result, others may be alerted to the soldier's injuries, and the injuries may be cared for in a more effective manner. In embodiments, as described herein, other user action capture inputs and/or devices, feedback related to external devices and/or applications, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of a user movement or action for controlling or initiating a command plus a command/control mode and interface in which inputs are reflected, such as a clenched fist and a navigable list. A soldier may use a gesture such as a clenched fist to bring up a navigable list of content projected on the eyepiece display. For example, an eyepiece camera may view the soldier's hand gesture, recognize and identify the gesture, and execute a command according to a predetermined database of gestures mapped to commands. In embodiments, hand gestures may include postures of the hand, fingers, arms, legs, and the like. In embodiments, as described herein, other user movements or actions for controlling or initiating commands, commands and/or control modes and interfaces in which inputs are reflected, and the like may also be applied.
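The gesture-to-command lookup described above can be sketched minimally as a dispatch table: a recognized posture label is matched against a predetermined database and the bound handler runs. The gesture labels and handler actions here are invented examples, not labels from the disclosure.

```python
GESTURE_COMMANDS = {
    "clenched_fist": lambda ui: ui.append("show_navigable_list"),
    "open_palm":     lambda ui: ui.append("dismiss_overlay"),
}

def execute_gesture(label, ui_state):
    """Dispatch a recognized gesture; unrecognized gestures are ignored."""
    handler = GESTURE_COMMANDS.get(label)
    if handler:
        handler(ui_state)
    return ui_state
```

Ignoring unknown labels (rather than raising) matches the behavior a wearer would expect: incidental hand movement that the recognizer cannot match simply does nothing.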
In one example, control aspects of the eyepiece may include the combination of a user movement or action for controlling or initiating a command plus an on-platform application that uses commands or responds to inputs, such as a head nod and an information display. A soldier may apply a gesture such as a head shake, arm motion, leg motion, or eye motion to bring up an information display. For example, the soldier may wish to access an application, database, network connection, and the like through the eyepiece, and may bring up a display application as part of a graphical user interface with a nod of their head (such as sensed by a motion detector in the eyepiece, on the soldier's head, on the soldier's helmet, and the like). In embodiments, as described herein, other user movements or actions for controlling or initiating commands, on-platform applications that use commands and/or respond to inputs, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of a user movement or action for controlling or initiating a command plus communication and connection from an interface on the platform to external systems and devices, such as an eye blink and an API to an external application. A soldier may bring up an application program interface with an eye blink, a head nod, a movement of an arm or leg, and the like, in order to access external applications. For example, the soldier may access an external application through an API embedded in the eyepiece facility, and do so with a blink of the eyes (such as detected through the optical monitoring capability of the eyepiece's optical system). In embodiments, as described herein, other user movements or actions for controlling or initiating commands, communications or connections from an interface on the platform to external systems and devices, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of a user movement or action for controlling or initiating a command and an external device to be controlled, such as accessing an external rangefinder device through a foot tap. A soldier may have a sensor that detects the movement of the soldier's foot (such as a motion sensor on their boot), and the soldier uses a foot movement (such as a tap of their foot) to determine the distance to an object (such as an enemy target) using an external rangefinder device. For example, the soldier may be aiming a weapon system, using both hands in the process. In this case, issuing commands by foot through the eyepiece allows the commands to be issued "hands-free". In embodiments, as described herein, other user movements or actions for controlling or initiating commands, useful external devices to be controlled, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of a user movement or action for controlling or initiating a command plus an application for a useful external device, such as forming a sign with the hand and a messaging application. A soldier may use a sign formed with the hand to trigger the sharing of information through an external information-transfer application (such as an external information feed, a photo/video sharing application, a text application, and the like). For example, the soldier turns on the embedded camera with a hand signal, and shares the video stream with another person, shares it to storage, and the like. In embodiments, as described herein, other user movements or actions for controlling or initiating commands, external device applications, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of a user movement or action for controlling or initiating a command plus feedback to the soldier related to external devices and applications, such as a head jolt plus an audible alarm. A soldier may wear an eyepiece equipped with an accelerometer (or a similar sensor capable of detecting g-forces and head jolts), where when the soldier experiences a dangerous head jolt, such as from a dangerously high g-force, an audible alarm is sounded as feedback to the user, as determined either as part of an application on the eyepiece or as part of an application off the eyepiece. In addition, the accelerometer output may be recorded and stored for analysis. For example, the soldier may experience a g-force head jolt produced by a nearby explosion, and the eyepiece may sense and record sensor data related to the jolt. Further, a dangerous level of head jolt may trigger automatic actions by the eyepiece, such as transmitting an alert to other soldiers and/or to the command center, beginning to monitor and/or transmit the soldier's health information from other sensors worn on the body, providing the soldier an audible indication related to their possible injuries, and the like. In embodiments, as described herein, other user movements or actions for controlling or initiating commands, feedback related to external devices and/or applications, and the like may also be applied.
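The head-jolt logic above can be sketched as a simple threshold monitor: every accelerometer sample is logged for later analysis, and any sample over a danger threshold queues the automatic alerts mentioned in the text. The 8 g threshold and the alert recipients are illustrative assumptions, not medical or doctrinal figures.

```python
DANGER_G = 8.0   # illustrative threshold, not a medical figure

class JoltMonitor:
    def __init__(self):
        self.log = []       # every sample is recorded for later analysis
        self.alerts = []    # (recipient, g_force) pairs queued for transmission

    def sample(self, g_force):
        """Record one accelerometer reading; return True if it is dangerous."""
        self.log.append(g_force)
        if g_force >= DANGER_G:
            self.alerts.append(("command_center", g_force))
            self.alerts.append(("squad", g_force))
        return g_force >= DANGER_G
```

Logging unconditionally while alerting conditionally matches the text: the full accelerometer record supports later analysis even for jolts that never crossed the alarm threshold.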
In one example, control aspects of the eyepiece may include the combination of a command/control mode and interface in which inputs are reflected plus an on-platform application that uses commands or responds to inputs, such as a graphical user interface plus the various applications resident on the eyepiece. The eyepiece may provide the soldier a graphical user interface presenting applications for selection. For example, the soldier may have a graphical user interface projected by the eyepiece that provides applications for different domains, such as military, personal, civilian, and the like. In embodiments, as described herein, other command/control modes and interfaces in which inputs are reflected, on-platform applications that use commands and/or respond to inputs, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of a command/control mode and interface in which inputs are reflected plus communication or connection from an interface on the platform to external systems and devices, such as a 3D navigation eyepiece interface plus a navigation system control interface to external systems. The eyepiece may enter a navigation mode and connect to external systems through a navigation system control interface. For example, the soldier is conducting military maneuvers and brings up a pre-loaded 3D image of the surrounding terrain through the eyepiece navigation mode, and the eyepiece automatically connects to external systems for updates, current objects of interest (such as overlaid from satellite imagery), and the like. In embodiments, as described herein, other command/control modes and interfaces in which inputs are reflected, communications or connections from an interface on the platform to external systems and devices, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of a command/control mode and interface in which inputs are reflected plus an external device to be controlled, such as an augmented reality interface plus an external tracking device. The soldier's eyepiece may enter an augmented reality mode and work with an external tracking device to retrieve information, shown as an augmented reality display overlay, related to the position of an object or person to be tracked. For example, the augmented display mode may include a 3D map, and the position of a person determined by the external tracking device may be overlaid on the map, with the displayed track moving as the tracked person moves. In embodiments, as described herein, other command/control modes and interfaces in which inputs are reflected, useful external devices to be controlled, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of a command/control mode and interface in which inputs are reflected plus an application for those external devices, such as a translucent display mode plus a simulation application. The eyepiece may be set into a translucent display mode to enhance the display of a simulation application to the soldier. For example, the soldier is preparing for a mission, and before entering the battlefield the soldier is provided a simulation of the mission environment; and since during the simulation there is no actual need for the user to view the true environment around them, the eyepiece places itself into the translucent display mode. In embodiments, as described herein, other command/control modes and interfaces in which inputs are reflected, external device applications, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of a command/control mode and interface in which inputs are reflected plus feedback to the user related to external devices and applications, such as an audible command interface plus tone output feedback. A soldier may place the eyepiece into an audible command interface mode, and the eyepiece responds with a tone output as feedback that the eyepiece system is ready to receive audible commands. For example, the audible command interface may include at least a portion of the audible command interface at an external location (such as out on the network), and once the entire system is ready to receive audible commands, the tone is provided. In embodiments, as described herein, other command/control modes and interfaces in which inputs are reflected, feedback related to external devices and/or applications, and the like may also be applied.
In one example, control aspects of the eyepiece may include the combination of an on-platform application that uses commands or responds to inputs plus communication or connection from an interface on the platform to external systems and devices, such as a communications application plus a network router, where the soldier may open the communications application and the eyepiece automatically searches for a network router in order to find a connection to the network facility. For example, the soldier is in the battlefield with their unit, and a new campsite is being established. Once the communications facility has been set up, the soldier's eyepiece can connect over a secure wireless connection. In addition, the eyepiece may alert the soldier once the communications facility has been established, even if the soldier has not yet attempted to communicate. In embodiments, as described herein, other on-platform applications that use commands and/or respond to inputs, communications or connections from an interface on the platform to external systems and devices, and the like may also be applied.
In one example, the control aspect of the eyepiece may include an application available on the platform that responds to input commands, in combination with a useful external device to be controlled, such as a video application plus an external camera. A soldier may interface with a deployed camera, such as for monitoring a battlefield. For example, the military may deploy a camera dropped from an aircraft, and the soldier then connects to the camera through the eyepiece video application. In embodiments, other applications available on the platform that respond to input commands, useful external devices to be controlled, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include an application available on the platform that responds to input commands, in combination with an application of an external device, such as a search application on the eyepiece plus an external search application. The search application on the eyepiece may be augmented with an external search application. For example, a soldier may search for the identity of an individual being questioned, and when the search on the eyepiece yields no result, the eyepiece connects to an external search facility. In embodiments, other applications available on the platform that respond to input commands, applications of external devices, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include an application available on the platform that responds to input commands, in combination with feedback to the soldier related to external devices and applications, such as an entertainment application plus performance indicator feedback. The entertainment application may serve as a rest mechanism for a soldier who needs rest but may be stressed for other reasons, with the performance feedback designed for the soldier under given conditions, such as when they are in a deployment where they need to rest but stay sharp, when attention is flagging and needs to be brought back, during down time, and the like. For example, a soldier may be in transit and about to enter an engagement. In such an instance, the entertainment application may be an action thinking game to improve attention and motivation, where the performance indicator feedback is designed to maximize the soldier's drive to perform and to think problems through to conclusions in a quick and efficient manner. In embodiments, other applications available on the platform that respond to input commands, feedback related to external devices and/or applications, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include an interface from the platform for communication or connection to external systems and devices, in combination with an external device to be controlled, such as a processor interface on the eyepiece to an external facility plus an external projector. The eyepiece processor may connect to an external projector so that others can view content available to the eyepiece. For example, a soldier may be in the field and may access content to be shared with others who are not wearing an eyepiece (such as non-military individuals). In this example, the soldier's eyepiece may be able to dock with an external projector and feed content from the eyepiece to the projector. In embodiments, the projector may be a pocket projector, a projector in a vehicle, a projector in a conference room, a remotely located projector, and the like. In embodiments, the projector may also be integrated into the eyepiece so that content can be externally projected from the integrated projector. In embodiments, other interfaces from the platform for communication or connection to external systems and devices, useful external devices to be controlled, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include an interface from the platform for communication or connection to external systems and devices, in combination with an application of an external device, such as an audio system controller interface plus an external audio system. A soldier may be able to connect the audio portion of the eyepiece facility (e.g., music, audio playback, audio network files, etc.) to an external sound system. For example, the soldier may be able to patch a communication being received through the eyepiece into a vehicle sound system so that others can hear it. In embodiments, other interfaces from the platform for communication or connection to external systems and devices, applications of external devices, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include an interface from the platform for communication or connection to external systems and devices, in combination with feedback to the soldier related to external devices and applications, such as a stepper controller interface plus state feedback. Through the stepper controller interface, the soldier may access and control a mechanism with a digital stepper control, where the mechanism provides the user with feedback about the status of the mechanism. For example, a soldier clearing a roadblock may have a lifting mechanism on their vehicle, and the soldier may interface directly with the lifting mechanism through the eyepiece. In embodiments, other interfaces from the platform for communication or connection to external systems and devices, feedback related to external devices and/or applications, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include a useful external device to be controlled, in combination with an application of that external device, such as a storage-enabled device plus an automated backup application. Soldiers in the field may be provided a data storage facility and an associated automated backup application. For example, the storage facility may be located in a military vehicle so that data can be backed up from multiple soldiers' eyepieces to the vehicle, especially when a network link is not available for downloading to a remote backup site. The storage device may be associated with a camp, associated with a subset of soldiers in the field (e.g., in a unit), carried on the soldiers themselves, and the like. In embodiments, the local storage facility may upload the backup when a network communications connection becomes available. In embodiments, other useful external devices to be controlled, applications of external devices, and the like, as described herein, may also be applied.
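The store-locally-then-upload behavior described above amounts to a deferred queue: records always land in the vehicle-local store, and the backlog is flushed to the remote site when connectivity returns. The sketch below is an illustrative data-structure sketch under that assumption; the class and method names are hypothetical, not from the patent.

```python
from collections import deque

class BackupQueue:
    """Queue eyepiece data in a vehicle-local store; flush to the remote
    backup site when a network link becomes available."""

    def __init__(self):
        self.local = deque()   # vehicle-local storage facility
        self.remote = []       # stands in for the remote backup site

    def store(self, record):
        # Always lands locally, whether or not the network link is up.
        self.local.append(record)

    def on_link_available(self):
        # Upload the entire backlog, oldest first, once connectivity returns.
        while self.local:
            self.remote.append(self.local.popleft())
```

In a real system the remote side would be a network call with retry handling, but the ordering guarantee (oldest records upload first) is the point of using a FIFO queue here.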
In one example, the control aspect of the eyepiece may include a useful external device to be controlled, in combination with feedback to the soldier related to external devices and applications, such as an external payment system plus feedback from that system. The soldier may access a militarily administered payment system, where the system provides feedback to the soldier (e.g., receipts, account balance, account activity, etc.). For example, the soldier may make a payment to a vendor through the eyepiece, where the eyepiece and the external payment system exchange data, authorization, funds, and the like, and the payment system provides feedback data to the soldier. In embodiments, other useful external devices to be controlled, feedback related to external devices and/or applications, and the like, as described herein, may also be applied.
In one example, the control aspect of the eyepiece may include an application of an external device, in combination with feedback to the soldier related to external devices and applications, such as information from an external 3D map-rendering facility plus display feedback shown along with the information. The soldier may be able to have 3D map information displayed through the eyepiece, where the map facility may provide feedback to the soldier based on information previously delivered, past requests, requests from others in the region, changes associated with the geographic region, and the like. For example, the soldier may receive a 3D map rendering from an external application, where the external application also provides the 3D map rendering to at least a second soldier in the same geographic region. The soldier may then receive feedback from the external facility related to the second soldier, such as their position as depicted in the 3D map rendering, identity information, movement history, and the like. In embodiments, other applications of external devices, feedback related to external devices and/or applications, and the like, as described herein, may also be applied.
In embodiments, the eyepiece may provide various forms of guidance to a user in response to medical conditions. As a first example, a user may employ the eyepiece to simulate, for training purposes, medical conditions that may occur in combat, in training, while on duty or off duty, and the like. The simulation may be geared toward a medical professional or a non-medical individual.
As an example, the eyepiece may be used by a junior combat soldier to view a medical simulation as part of a training module to provide training in responding to medical conditions on the battlefield. The eyepiece may provide an enhanced environment in which the user views wounds overlaid on another soldier, simulating wounds that are common in the field or that may be found in the field. The soldier may then receive prompts through the user interface to respond to the situation presented. The user may be given step-by-step instructions for a series of actions for providing emergency medical relief in the field, or the user may perform actions in response to the situation, with those actions then being corrected until the appropriate response is presented.
Similarly, the eyepiece may provide a training environment for a medical professional. The eyepiece may present to the user conditions or situations requiring a medical response for the purpose of training the medical professional. The eyepiece may present common battlefield scenarios for which its user is required to master the appropriate response and lifesaving skills.
As an example, the user may be presented with an augmented reality of a wounded soldier, where the soldier's body has a bullet wound. The medical professional may then carry out the steps he feels are the appropriate response for the situation, such as by selecting from the eyepiece's user interface the steps he feels are proper for the situation, inputting the steps into the eyepiece's user interface, and the like. The user may respond through the use of sensors and/or input devices, or he may input the steps of his response into the user interface via eye movements, hand gestures, and the like. Similarly, he may select the appropriate steps presented to him through the user interface via eye movements, hand gestures, and the like. As actions are carried out and as the user makes decisions about treatment, additional guidance and instruction may be presented to the user based on his performance. For example, if the user is presented with a soldier with a bullet wound to the chest and the user begins to raise the soldier into a dangerous position, the user may be cautioned or prompted to change his course of treatment. Alternatively, the correct steps may be prompted to the user so that the appropriate procedure is carried out. Further, under a training scenario, an example of the wounded soldier's medical record may be presented to the trainee, where the user may have to make at least some of his decisions based on content included in the medical record. In various embodiments, the user's actions and performance may be recorded and/or documented by the eyepiece for further evaluation and instruction after the training session is paused or otherwise stopped.
In embodiments, the eyepiece may provide various forms of guidance to the user in response to actual medical conditions in combat. As an example, an untrained soldier may be prompted with step-by-step lifesaving instructions for a comrade under conditions where a medic cannot be present immediately. When a comrade is wounded, the user may input the type of wound, the eyepiece may detect the wound, or a combination of these may occur. At this point, lifesaving instructions for treating the wounded soldier may be provided to the user. Such instructions may be presented in the form of augmented reality as a step-by-step program of instructions for the user. In addition, the eyepiece may provide the user with enhanced visual aids, such as anatomical overlays showing the locations of vital organs of the soldier's body near the wound. Further, the eyepiece may record video of the situation, which may then be sent back to a medic away from the field or rushing to the field, to allow the medic to coach the untrained user through appropriate lifesaving techniques on the battlefield. Additionally, the wounded soldier's eyepiece may send vital information to the eyepiece of the soldier administering treatment (such as information about the wounded soldier collected by integrated or associated sensors), which may then be forwarded to a medic, or it may be sent directly to a medic at a remote location, so that the treating soldier can provide medical assistance to the wounded soldier based on information collected from the wounded soldier's eyepiece.
In other embodiments, when conditions present themselves on the battlefield, a trained medic may use the eyepiece to provide an anatomical overlay of the soldier's body so that he can respond more appropriately to the situation at hand. Merely as an example and without limiting the present invention, if a wounded soldier is bleeding from a bullet wound to the leg, the user may be presented with an augmented reality overlay of the soldier's arteries so that the user can determine whether an artery has been hit and how severe the injury is. An appropriate protocol for the given wound may be presented to the user through the eyepiece so that he can review each step over the course of treatment. Such protocols may also be presented to the user using augmented reality, video, audio, or other formats. The eyepiece may provide the medic with the protocol in the form of augmented reality instructions in a step-by-step program. In embodiments, an augmented reality overlay of the wounded soldier's organs may also be presented to the user to guide the medic through any procedure, so that the medic does not cause additional injury to the soldier's organs in the course of treatment. Further, the eyepiece may provide the user with enhanced visual aids, such as anatomical overlays showing the locations of vital organs of the soldier's body near the wound.
In embodiments, the eyepiece may be used to scan the retina of a wounded soldier in the field to obtain his medical record. This may alert the medic to possible drug allergies, or may provide other important details of benefit during a medical procedure.
In addition, if the wounded soldier is wearing an eyepiece, the device may send information including the wounded soldier's heart rate, blood pressure, respiratory pressure, and the like to the medic's glasses. The eyepiece may also assist the user in observing the soldier's gait to determine whether the soldier has a head injury, and may help the user determine the location of bleeding or injury. Such information may provide the user with information about possible medical treatment, and in embodiments, an appropriate protocol or a selection of protocols may be displayed to the user to help him treat the patient.
In other embodiments, the eyepiece allows the user to monitor other symptoms of the patient, such as for a mental health status check. Similarly, the user may check whether the patient is exhibiting rapid eye movement, and may further use the eyepiece to provide the patient with sedation treatment, eye movement exercises, breathing exercises, and the like. In addition, when information about the wounded soldier's vital signs and health data is collected and sent from the wounded soldier's eyepiece to the medic's eyepiece, the medic may be provided with that information. This may provide the medic with real-time data from the wounded soldier without the medic having to determine such data himself, such as by measuring the wounded soldier's blood pressure.
In various embodiments, the user may be provided a reminder from the eyepiece telling him how far away an air or ground rescue is from his position in the field. This may provide the medic with important information, reminding him whether certain procedures should or must be attempted given the time available in the situation, and it may provide the wounded soldier with the reassurance of knowing that rescue is on the way, or remind him that he may need other sources of help.
In other embodiments, the user may be provided a warning about his own vital signs if a problem is detected. For example, if the soldier's blood pressure is elevated, he may be warned, alerting him that he must take medication or, if possible, withdraw himself from combat to return his blood pressure to a safe level. Also, the user may be warned about other such data, such as his pupil size, heart rate, changes in gait, and the like, to determine whether the user is experiencing a medical problem. In other embodiments, the user's eyepiece may also alert medical personnel at another location about the user's medical condition, to send help directed to the user whether or not he knows he needs such help. Further, general data may be aggregated from multiple eyepieces to provide a commander with details such as how many of his soldiers in an engagement are wounded, how badly they are injured, and the like.
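The vital-sign warning described above reduces to range checks against safe limits per sign. The following is a minimal sketch under that assumption; the sign names and limit values are illustrative, not clinical guidance or the patent's method.

```python
# Hypothetical threshold check: flag any monitored vital sign that falls
# outside its configured safe range, so the eyepiece can warn the wearer
# (and, in embodiments, alert remote medical personnel).

def check_vitals(vitals, limits):
    """Return the names of vital signs outside their (low, high) safe ranges."""
    alerts = []
    for name, value in vitals.items():
        lo, hi = limits[name]
        if not (lo <= value <= hi):
            alerts.append(name)
    return alerts

# Example reading: elevated systolic blood pressure, normal heart rate.
reading = {"systolic_bp": 165, "heart_rate": 72}
safe_ranges = {"systolic_bp": (90, 140), "heart_rate": (50, 100)}
alerts = check_vitals(reading, safe_ranges)
```

The same function, run over readings aggregated from many eyepieces, would support the commander-level summary (how many soldiers are currently flagging alerts) mentioned in the paragraph.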
In various embodiments, trained medical professionals may also use the eyepiece in medical response outside of combat. Such an eyepiece has uses similar to those described above for medics at headquarters, or away from headquarters but outside of combat conditions. In this manner, the eyepiece may provide the user with a means of obtaining augmented reality assistance during a medical procedure, documenting a medical procedure, performing a medical procedure remotely under the guidance of a commanding officer via video and/or audio whether on or off a military base, and the like. This may provide assistance in a variety of situations in which a medic may need additional help. One example of such a situation may occur when a medic is on duty outside of a training routine, on maneuvers, on a military hike, and the like. Such assistance may be important when the medic is the only responder, when he is a new medic, when he is approaching a new situation, and the like.
In some embodiments, the eyepiece may provide user guidance in environments related to military transport aircraft. For example, the eyepiece may be used in such an environment when training, entering war, on reconnaissance or rescue missions, in a mobile unit, performing maintenance on the aircraft, and the like. Such uses may be suited to personnel of various grades and ranks.
For purposes of illustration, the user may be aboard a transport aircraft and receive audio and visual information through the eyepiece as he enters a training routine. The information may provide the user with details about the training mission, such as battlefield conditions, weather conditions, mission instructions, maps of the region, and the like. The eyepiece may simulate a true war scenario to prepare the user for combat. The eyepiece may also record the user's responses and actions through various means. Such data collection allows the user to receive feedback on his performance. Further, the eyepiece may change the simulation during the training routine based on the results obtained, whether to alter the simulation while it is underway or to alter future simulations for the user or for various users.
In embodiments, the eyepiece may provide user guidance and/or interaction aboard a military transport aircraft when the transport is about to enter combat. The user may receive audio and visual information about the mission while aboard. A checklist may be displayed to the user to ensure that he has the appropriate materials and equipment for the mission. In addition, instructions for the proper use of securing equipment and seat belts may be presented, along with information about the aircraft (such as the locations of emergency exits, oxygen tanks, and safety equipment). The user may be presented with instructions, such as when to rest before the mission and when medication is administered for that purpose. The eyepiece may provide the user with noise cancellation for rest before the mission, and may then alert the user when his rest is to end and further mission preparation will begin. Additional information may be provided, such as maps of the battlefield, the number of vehicles and/or personnel on the battlefield, weather conditions at the battlefield, and the like. The device may provide links to other soldiers, so that instruction and combat preparation may include soldier interaction, where a commanding officer may be heard by subordinates, and the like. Further, the information for each user may be formatted to suit his particular needs. For example, a commanding officer may receive more highly classified information that may not be provided to lower-ranking officers.
In embodiments, the user may use the eyepiece aboard a military transport aircraft on a reconnaissance or rescue mission, where the eyepiece captures and stores various images and/or video of places of interest as the aircraft flies over regions, which may be used to obtain information about a potential battlefield and the like. The eyepiece may be used to detect the movement of people and vehicles on the ground, and thereby detect enemies to be defeated or friendly forces to be rescued or assisted. The eyepiece may provide the ability to apply markers to maps or images of regions flown over and searched, to color-code regions as having been searched or as still needing to be searched.
In embodiments, the user aboard a military transport aircraft may be provided with instructions and/or checklists of the quantity and location of equipment to be inventoried or moved, and with instructions for the special handling of various equipment. Warnings about approaching vehicles may be provided to the user while items are being unloaded or loaded, to ensure safety.
For the maintenance and safety of military transport aircraft, the user may be provided with a preflight checklist for the correct operation of the aircraft. The pilot may be warned if correct maintenance has not been completed before a mission. In addition, a graphical overview or list of the aircraft's history may be provided to the aircraft operator to track the history of aircraft maintenance.
In some embodiments, the eyepiece may provide user guidance in environments related to military fighter aircraft. For example, the eyepiece may be used in such environments when training, entering combat, for maintenance, and the like. Such uses may be suited to personnel of various grades and ranks.
As an example, the eyepiece may be used by a user for combat training with military fighter aircraft. The user may be presented with an augmented reality scenario simulating combat conditions in a particular military jet or aircraft. The user's responses and actions may be recorded and/or analyzed to provide the user with additional information, to evaluate him, and to change training routines based on past data.
In various embodiments related to actual combat, the user may be presented with information showing friendly and non-friendly aircraft around and/or approaching him. The user may be presented with information about enemy aircraft, such as maximum speed, maneuverability, and range. In embodiments, the user may receive information related to the presence of ground hazards and be warned of the situation. The eyepiece may be synchronized to the user's aircraft and/or aircraft instrumentation so that the pilot can see urgent warnings and additional information about the aircraft not typically displayed in the cockpit. In addition, the eyepiece may display the number of seconds to a target area, or the time-to-threat based on a missile launched from a vehicle or a pop-up threat. The eyepiece may suggest maneuvers for the pilot to execute based on the surrounding environment, potential threats, and the like. In embodiments, the eyepiece may detect and display friendly aircraft even if the friendly aircraft are in a stealth mode.
In embodiments, the user may be provided with a preflight checklist for the correct operation of the fighter aircraft. If correct routine maintenance has not been completed before a mission, the pilot may be alerted through links with maintenance records, the aircraft computer, and the like. The eyepiece may allow the pilot to view the aircraft's maintenance history along with charts and diagrams of that history.
In some embodiments, the eyepiece may provide user guidance in environments related to military helicopters. For example, the eyepiece may be used in such environments when training, entering combat, for maintenance, and the like. Such uses may be suited to personnel of various grades and ranks.
As an example, the eyepiece may be used by a user for training in the operation of a military helicopter in combat or high-pressure situations. The user may be presented with an augmented reality scenario simulating combat conditions in a given aircraft. The user's responses and actions may be recorded and/or analyzed to provide the user with additional information, to evaluate him, and to change training routines based on past data.
During training and/or combat, the user's eyepiece may be synchronized to the aircraft to provide important statistics about the aircraft and maintenance alerts. The user may review plans as well as safety and emergency procedures for passengers as he boards the aircraft. Such procedures may illustrate how to ride the aircraft safely, how to operate the doors to enter and exit the aircraft, the location of lifesaving equipment, and other information. In embodiments, the eyepiece may present the user with the position and/or orientation of threats, such as those that may pose a danger during the helicopter's flight. For example, the user may be presented with the positions of low-altitude flight threats (such as drones and other helicopters) and the positions of land-based threats. In embodiments, noise-canceling headsets may be provided together with the eyepiece and a multi-user interface, to allow communication during flight. In situations where the helicopter goes down, the user's eyepiece may transmit position and helicopter information to commanding officers and rescue teams. Further, using the eyepiece's night vision during low-altitude flight missions allows the user to search for or locate the enemy without being detected, with the powerful helicopter spotlight turned off.
In embodiments, as described in the various examples herein, the eyepiece may provide assistance in tracking aircraft maintenance and in determining whether correct routine maintenance has been performed. Further, as with the other aircraft and vehicles mentioned herein, augmented reality may be used to provide assistance in the maintenance and operation of the aircraft.
In some embodiments, the eyepiece may provide user guidance in environments related to military drone aircraft or robots. For example, the eyepiece may be used in such environments during reconnaissance, capture and rescue missions, combat, situations posing particular risk to humans, and the like.
In embodiments, the eyepiece may provide the user with a video feed of the drone's surroundings. Real-time video containing information about various regions of interest may be displayed for up to several seconds. Collecting such information may provide soldiers with knowledge about the number of enemy soldiers in a region, the layout of buildings, and the like. In addition, data may be collected from the drone and/or robot and sent to the eyepiece to gather information about the position of persons of interest to be captured or rescued. For example, a user outside a safe house or bunker may use the drone and/or robot to send back video or data feeds about the position, number, and activities of the people in the safe house, in preparation for a capture or rescue.
In embodiments, using the eyepiece in conjunction with drones and/or robots allows a commanding officer to collect battlefield data during a mission, make planning changes, and provide various instructions to teams based on the collected data. In addition, the eyepiece and its associated controls may allow the user to deploy weapons on the drone and/or robot through the user interface in the eyepiece. Data feeds sent from the drone and/or robot may provide the user with information about what weapons are to be deployed and when to deploy them.
In embodiments, data collected from drones and/or robots allows the user to approach potentially dangerous situations. For example, this allows the user to investigate biological spills, bombs, projectiles, foxholes, and the like, providing the user with data about the situation and environment while keeping the user out of direct harm.
In some embodiments, the eyepiece may provide user guidance in environments related to military marine vessels. For example, the eyepiece may be used in such environments when training, entering war, on search and rescue missions, performing post-disaster cleanup, performing maintenance, and the like. Such uses may be suited to personnel of various grades and ranks.
In embodiments, the eyepiece may be used in training to allow the user to prepare for the performance of the various skill sets of his job responsibilities aboard ship. Training may include simulations testing the user's ability to navigate, control the vessel, perform various tasks under combat conditions, and the like. The user's responses and actions may be recorded and/or analyzed to provide the user with additional information, to evaluate him, and to change training routines based on past data.
In embodiments, the eyepiece may allow the user to view potential vessel threats beyond the horizon by providing the user with an augmented reality view of the situation. Such threats may be indicated by dots, diagrams, or other means. Once the eyepiece detects a particular threat, an indication to prepare for engagement with the enemy may be sent to the user through the eyepiece. Further, the user may view maps or video of a harbor where they will be docking, and may be provided with hostile locations. In embodiments, the eyepiece may allow the user to synchronize with the ship and/or weaponry to guide the user's use of navigation equipment during combat. The user may also be alerted through the eyepiece to where international and national waters lie.
In embodiments where search and rescue are required, the eyepiece may be used to track water currents and/or to mark recently searched waters. In embodiments where water currents are tracked, this may provide the user with information about the potential or changed position of a person of interest to be rescued. Similarly, the eyepiece may be used in environments where the user must investigate the surroundings. For example, the user may be alerted to significant changes in water pressure and/or water movement, where such changes may signal nearby mantle movement and/or an impending disaster. Reminders about changes in the earth's mantle, earthquake and/or tsunami threats, and the like may be sent to the user through the eyepiece. Such reminders may be provided by the eyepiece, synchronized with equipment on the ship, through the tracking of ocean water movement, changes in currents, changes in water pressure, the falling or rising of surrounding water levels, and the like.
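The "significant change" alert described above can be sketched as a difference check over a time series of sensor readings: flag any step between consecutive samples that exceeds a threshold. This is an illustrative sketch only; the threshold value and function name are assumptions, and a real system would calibrate against tide and swell baselines.

```python
# Hypothetical change detector over shipboard pressure/level readings:
# flag sample indices where the jump from the previous reading exceeds
# a configured threshold, which would trigger an eyepiece alert.

def significant_changes(readings, threshold):
    """Return indices where |readings[i] - readings[i-1]| > threshold."""
    return [i for i in range(1, len(readings))
            if abs(readings[i] - readings[i - 1]) > threshold]

# Steady water pressure, then a sudden jump at the third sample.
pressure_series = [10.0, 10.1, 13.5, 13.4]
flagged = significant_changes(pressure_series, threshold=1.0)
```

The same detector applies unchanged to current speed or water-level series, which is why the paragraph groups those signals together.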
In embodiments where a military ship is deployed for disaster cleanup, the eyepiece may be used to detect the contaminated region, the speed at which the contamination is traveling, its depth, and a prediction of where the contamination will stop. In embodiments, the eyepiece may be used to determine changes in the location of a contaminated volume by detecting the ppm (parts per million by volume) of contaminant contained in the polluted volume and the variations thereof.
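The ppm-based tracking described above can be sketched as a simple concentration computation with a change threshold. This is a minimal illustration only; the function names and the threshold value are hypothetical and not part of the original disclosure.

```python
def ppm(contaminant_volume: float, total_volume: float) -> float:
    """Parts-per-million (by volume) concentration of a contaminant in a sample."""
    return 1e6 * contaminant_volume / total_volume


def plume_moved(prev_ppm: float, curr_ppm: float, threshold: float = 50.0) -> bool:
    """Flag a significant concentration change between two readings at the same
    location, suggesting the contaminated volume has shifted (threshold is
    illustrative)."""
    return abs(curr_ppm - prev_ppm) >= threshold
```

Repeated readings at fixed sampling points, compared in this way, would let the display mark where concentration is rising or falling and thus estimate the plume's direction of travel.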
In various embodiments, the eyepiece may provide the user with a program for checking the correct operation of the ship and the equipment thereon. In addition, operators of the ship may be alerted if proper routine maintenance has not been completed before deployment. In embodiments, the user may be able to view the ship's maintenance history and the status of important ship functions.
In embodiments, the eyepiece may provide the user with various forms of guidance in a submarine environment. For example, the eyepiece may be used in such environments during training, in combat, for maintenance, and the like. Such uses may be appropriate for personnel of various grades and ranks.
As an example, the eyepiece may be used by the user for training in submarine operations in combat or high-pressure situations. The user may be presented with an augmented reality scenario simulating combat conditions and the like in a particular submarine. The training program may be based on the user's rank, such that his rank determines the type of scenario presented. The user's responses and actions may be recorded and/or analyzed to provide the user with additional information, assess performance, and modify the training routine according to past data. In embodiments, the eyepiece may also train the user in maintaining the submarine, operating the submarine, proper procedures, and the like.
In a combat environment, the eyepiece may be used to provide the user with information about the user's depth, the positions of enemies and objects, and friendly and/or enemy forces on the surface. In embodiments, such information may be conveyed to the user in a visual presentation, by audio, and the like. In various embodiments, the eyepiece may be synchronized to the submarine's equipment and gear, and/or may use the submarine's equipment and state to gather various information, collecting data from GPS, sonar, and the like on the positions of other objects, the submarine itself, and so on. The eyepiece may display to the soldier instructions regarding security procedures, mission details, and enemy presence in the region. In embodiments, the device may communicate or synchronize with the ship and/or weaponry to guide the user in using such equipment and to provide a display related to the specific equipment. Such a display may include visual and audio data related to the equipment. As a further example, the device may be used together with a periscope to enhance the user's visual image and/or audio to show the locations of potential threats and points of interest, and to display information that might not be obtainable through the periscope alone, such as the positions of enemies outside the field of view, national and international waters, various threats, and the like.
The eyepiece may also be used in submarine maintenance. For example, it may provide the user with a pre-voyage check of the vessel's correct operation, and it may provide reminders of proper routine maintenance operations that have not been performed or completed before a mission. In addition, the user may be provided with a detailed history to review maintenance performed, and the like. In embodiments, the eyepiece may also assist in maintaining the submarine by providing augmented reality or other maintenance instructions to guide the user's program as the maintenance is performed.
In embodiments, the eyepiece may provide the user with various forms of guidance in the environment of a ship in port. For example, the eyepiece may be used in such environments during training, in combat, for maintenance, and the like. Such uses may be appropriate for personnel of various grades and ranks.
As an example, the eyepiece may be used by the user for training for combat, or for attack or high-pressure situations, on a ship in port. The user may be presented with an augmented reality scenario simulating combat conditions as they might be seen in a particular harbor and on such a ship. The training program may display terrain data from harbors around the world and their surroundings, data on the number of allied or enemy ships that may be in a harbor at a given time, and it may show local fueling stations and the like. The training program may be based on the user's rank, such that his rank determines the type of scenario presented. The user's responses and actions may be recorded and/or analyzed to provide the user with additional information, assess performance, and modify the training routine according to past data. In embodiments, the eyepiece may also train the user in maintaining and servicing shipboard machinery, operating the ship, using proper shipboard security procedures, and the like.
In a combat environment, the eyepiece may be used to provide the user with information related to the harbor where the user will dock or is docking. The user may be provided with information about the positions of enemy and/or friendly ships in the harbor, or other visual representations. In embodiments, the user may receive warnings about approaching aircraft and enemy ships, and the user may synchronize with the ship and/or weaponry to be guided in using the equipment while simultaneously being provided with information and/or display data about that equipment. Such data may include the quantity and effect of specific munitions, and the like. The eyepiece may display to the soldier instructions regarding security procedures, mission details, and enemy presence in the region. Such a display may include visual and/or audio information.
The eyepiece may also be used in ship maintenance. For example, it may provide the user with a pre-voyage check of the vessel's correct operation, and it may provide reminders of proper routine maintenance that has not been performed or completed before a mission. In addition, the user may be provided with a detailed history to review maintenance performed, and the like. In embodiments, the eyepiece may also assist in maintaining the ship by providing augmented reality or other instruction to guide the user in performing such maintenance.
In other embodiments, the user may use the eyepiece or other device to obtain biometric information about individuals approaching the harbor. Such information may provide the user with a person's identity and let the user know whether the person is a threat or someone of interest. In other embodiments, the user may scan articles or containers imported into the harbor to find potential threats shipped in cargo, and the like. The user may detect dangerous substances by density or by various other information collected by sensors associated with the eyepiece or device. The eyepiece may record information or scan documents to determine whether a document has been forged or altered in some way. This may help the user check personal credentials, and it may be used to check proof documents associated with specific goods so as to alert the user to potential threats or problems related to the goods, inaccurate inventories, forged documents, and the like.
In embodiments, the eyepiece may provide the user with various forms of guidance when using a tank or other land vehicle. For example, the eyepiece may be used in such environments during training, in combat, for surveillance, for convoy transport, for maintenance, and the like. Such uses may be appropriate for personnel of various grades and ranks.
As an example, the user may use the eyepiece for training in operating a tank or other land vehicle in combat, under attack, or in high-pressure situations. The user may be presented with an augmented reality scenario simulating the combat conditions seen in a tank and/or while operating a tank. The training program may test the user on the correct use of equipment and weapons, and the like. The training program may be based on the user's rank, such that his rank determines the type of scenario presented. The user's responses and actions may be recorded and/or analyzed to provide the user with additional information, assess performance, and modify the training routine according to past data. In embodiments, the eyepiece may also train the user in maintaining the tank, operating the tank, proper security procedures when using or mounting the vehicle, and the like.
In a combat environment, the eyepiece may be used to provide the user with information and/or visual presentations related to the positions of enemy and/or friendly land vehicles. In embodiments, the user may receive warnings about approaching aircraft and enemy vehicles, and the user may be synchronized with the tank and/or weaponry to be guided in using the equipment while simultaneously being provided with information and/or display data about that equipment. Such data may include the quantity and effect of specific munitions, and the like. The eyepiece may display to the soldier instructions regarding security procedures, mission details, and the presence of enemies and friendly forces in the region. Such a display may include visual and audio information. In embodiments, the user may be sent a 360-degree view of the surroundings outside the tank, which may be achieved by using the eyepiece synchronized to cameras or other devices having such a view. Video/audio feeds may be provided to as many users inside or outside the tank/vehicle as possible. This allows the user to monitor the vehicle and stationary threats. The eyepiece may communicate with the vehicle, and with the various other vehicles, aircraft, and equipment described herein or otherwise apparent to those skilled in the art, to monitor vehicle statistics, armor damage, engine status, and the like. The eyepiece may further provide GPS for navigation purposes, and may use black silicon or other technologies described herein to detect enemies at night and to navigate the environment under suboptimal viewing conditions, and the like.
In addition, the eyepiece may be used for surveillance in a tank/land-vehicle environment. In embodiments, the user may be able to synchronize with cameras or other devices to obtain a 360-degree field of view for gathering information. Night vision and/or SWIR as described herein, and the like, may be used to gather further information if needed. The user may use the eyepiece to survey the environment by detecting heat signatures to detect potential threats, and may check soil density and the like to detect roadside bombs, vehicle tracks, various hazards, and so on.
In embodiments, the eyepiece may be used to facilitate convoy transport using tanks or other land vehicles. For example, the user may be provided with an inventory of the articles and personnel to be transported, where the inventory is visual, interactive, and the like. The user may be able to track and update the inventory of articles so as to track those articles in transit, and so on. The user may be able to view maps of the surrounding region, scan proof documents and files for the identification of personnel, identify and track articles related to individuals in transit, view an individual's route/mission information during transport, and the like.
The eyepiece may also be used in vehicle maintenance. For example, it may provide the user with a pre-trip check of the correct operation of the tank or other vehicle, and it may provide reminders of proper routine maintenance that has not been performed or completed before a mission. In addition, the user may be provided with a detailed history to review maintenance performed, and the like. In embodiments, the eyepiece may also assist in maintaining the vehicle by providing augmented reality or other instruction to guide the user in performing such maintenance.
In embodiments, the eyepiece may provide the user with various forms of guidance in urban or suburban environments. For example, the eyepiece may be used in such environments during training, in combat, for surveillance, and the like. Such uses may be appropriate for personnel of various grades and ranks.
As an example, the user may use the eyepiece for training in urban or suburban environments for combat, for attack or high-pressure situations, for interacting with local personnel, and the like. The user may be presented with an augmented reality scenario simulating the combat conditions seen in such environments. The training program may test the user on the correct use of equipment and weapons, and the like. The training program may be based on the user's rank, such that his rank determines the type of scenario presented. The user's responses and actions may be recorded and/or analyzed to provide the user with additional information, assess performance, and modify the training routine according to past data. In embodiments, the user may view alternating scenarios in urban and suburban settings, where the settings include real buildings and building layouts and potential combat areas. Before entering a region, the user may be provided with climate and weather information, and may be notified of the number of people generally in the region at a given time or time of day, in preparation for possible attacks or other engagements. In addition, the user may be provided with the positions of individuals in, around, and on top of the buildings in a given area so that the user is prepared before entering the environment.
In urban and suburban environments, the eyepiece or other device also allows the user to investigate local personnel. The user may collect face, iris, voice, fingerprint, and palm-print data of persons of interest. The user may scan for such data undetected from a distance of 0-5 meters from the POI, from a greater distance, or right next to the POI. In embodiments, the user may use the eyepiece to see through smoke and/or a destroyed environment, to identify and record the presence of vehicles in the region, to record images of the surroundings for future use (such as in an action plan), to mark the population density of the region at various times of day, the layouts of various buildings and pathways, and the like. In addition, the user may collect and receive intelligence about specific local inhabitants associated with the soldier.
In combat, the user may use the eyepiece or other device in the urban/suburban environment. The device allows the user to geolocate and destroy hostile targets by means of a laser rangefinder. In embodiments, a bird's-eye view of the surroundings and buildings may be provided. Enemies in the user's surrounding area may be displayed, along with the positions of identified individuals (such as enemies, friendly forces, or members of the user's group). The user may use the eyepiece or other device to keep in contact with his headquarters, viewing/listening through the eyepiece to instructions from a commander, where those instructions may be issued after reviewing or listening to data from the user's environment. In addition, the eyepiece may also allow the user to give orders to other members of his group. In embodiments, the user may perform biometric data collection on nearby individuals, recording such information and/or retrieving information about them for use in combat. The user may link with other soldiers' devices and make use of the various equipment carried by those soldiers. In embodiments, the eyepiece may warn the user of rapidly approaching building edges when on a roof, and issue warnings when approaching ground transitions or obstacles. The user may be able to view a map of the environment combined with the positions of the members of his group, and he may be able to detect possible nearby enemies and signal and warn others in the vicinity. In various embodiments, the eyepiece may be used by the user to communicate with other group members to execute a plan. In addition, the user may use the eyepiece to detect enemies in dark tunnels and other areas in which enemies may be located.
The eyepiece may be used in desert environments. In addition to the general and/or applicable uses described herein related to training, combat, survival, surveillance purposes, and the like, the eyepiece may further be used in various usage scenarios that may be encountered in environments such as desert environments. As an example, when entering combat or training, the user may use the eyepiece in combat, surveillance, and training to correct for vision degraded by sandstorms. In addition, in a training mode, the eyepiece may simulate for the user the poor visibility of a sandstorm and other desert hazards. In combat, the eyepiece may assist the user in seeing or detecting enemy presence in a sandstorm in the manners described above. In addition, the user may be alerted to, and/or may be able to see, the difference between sand clouds generated by wind and those caused by vehicles, so as to be warned of a potential enemy approach.
In various embodiments, the user may use the eyepiece to detect terrain and environmental hazards. For example, the user may use the eyepiece to detect the edges of dunes, sand barriers, and the like. The user may also use the eyepiece to detect sand density so as to detect various hazards, such as holes in the ground, cliffs, mines and bombs, buried equipment, and the like. The user may be presented with a map of the desert to view the locations of such hazards. In embodiments, the user may be provided with a device through which his vital signs are monitored and which alerts him when he is in danger from extreme environmental conditions (such as the cold of night, temperature swings, dehydration from the heat). Such warnings may illustratively be provided through monitoring in a user interface displayed in the eyepiece and/or provided by audio information.
In embodiments, the user may be presented with a map of the desert to view the position of his group, and he may use the eyepiece to detect nearby signals or receive warnings of possible hostile forces, displayed on the map or delivered as audio alerts through earphones. In such embodiments, the user may have an advantage over his enemy, because he may have the ability to determine the positions of his group and of the enemy in a sandstorm, in a building, in a vehicle, and the like. The user may view a map of his position, which may display regions the user has recently traveled in one color and new regions in another color. In this way, or by other means, the device may keep the user from getting lost and/or keep him moving in the correct direction. In embodiments, the user may be provided with a weather satellite overlay to alert the user to sandstorms and hazardous weather.
The eyepiece may be used in wilderness environments. In addition to the general and/or applicable uses described herein related to training, combat, survival, surveillance purposes, and the like, the eyepiece may further be used in various usage scenarios that may be encountered in environments such as wilderness environments.
As an example, the user may use the eyepiece in training to prepare for the wilderness. For example, the user may use the eyepiece to simulate varying degrees of wilderness environments. In embodiments, the user may experience heavy trees/brush surrounding dangerous animals, and in other training environments he may face the challenge of having fewer places to hide from the enemy.
In combat, the user may use the eyepiece for numerous purposes. The user may use the eyepiece to detect the presence of freshly broken branches so as to detect recent enemy presence. In addition, the user may use the eyepiece to detect dangerous cliffs, caves, changes in terrain, recently moved/disturbed dirt, and the like. As an example, by detecting the presence of recently disturbed dirt (which may be detected if it has a density or heat signature different from the surrounding dirt/leaves, or which may be detected otherwise), the user may be alerted to traps, bombs, or other dangerous devices. In each of the environments described herein, the user may use the eyepiece to communicate with his group through a user interface or other means, such that communication may remain silent and/or undetected by the enemy in enclosed environments, in echo-sensitive open environments, and the like. Also, in each environment, the user may use the night vision described herein to detect the presence of enemies. The user may also view overlays of trail maps and/or alpine trail maps in the eyepiece, so that the user may view a path before encountering potentially hazardous regions and/or situations in which enemies may be located. In each of the environments described herein, the eyepiece may also amplify the user's hearing for the detection of potential enemies.
In embodiments, the user may use the eyepiece in wilderness environments in search and rescue use cases. For example, the user may use the eyepiece to detect soil/leaf movement to determine whether the ground has been disturbed, for use in tracking human trails and finding buried bodies. The user may view a map of the region, marked to show the areas already covered by air forces and/or other group members, so that the user is directed from searched regions to unsearched regions. In addition, the user may use the eyepiece for night-vision detection of humans and/or animals through trees, undergrowth, bushes, and the like. Furthermore, by using the eyepiece to detect the presence of freshly broken twigs, the user may be able to detect the presence, or recent presence, of a person of interest while on surveillance and/or rescue missions. In embodiments, the user may also view overlays of trail maps and/or alpine trail maps in the eyepiece, so that the user may view a path before encountering potentially hazardous regions and/or situations.
In other embodiments, the user may use the eyepiece in the wilderness for living off the land and in survival-type situations. As an example, when searching for food, the user may use the eyepiece to track animal trails and movement. In addition, the user may use the eyepiece for detecting soil moisture and for detecting the presence and location of bodies of water. In embodiments, the eyepiece may also amplify the user's hearing to detect animals that may serve as prey.
The eyepiece may be used in arctic environments. In addition to the general and/or applicable uses described herein related to training, combat, survival, surveillance purposes, and the like, the eyepiece may further be used in various use cases that may be encountered in environments such as arctic environments. For example, when in training, the eyepiece may simulate the visual and audio white-out weather conditions the user may encounter in arctic environments, so that the user may adapt to operating under such stresses. In addition, the eyepiece may provide the user with a program that simulates various situations and scenarios according to extreme cold, and the program may track and display data related to the user's predicted heat loss. Furthermore, the program may be adapted to simulate situations the user might experience under such heat loss. In embodiments, where the user cannot properly control his limbs, reduced weapon accuracy may result. In other embodiments, the user may be provided with help information and instruction on matters such as burrowing into snow to keep warm, and various survival skills for arctic conditions. In other embodiments, the eyepiece may be synchronized to a vehicle so that the vehicle appears to perform and respond as it would in a particular environment with arctic conditions, ice, and snow. Accordingly, the vehicle may respond to the user in a similar fashion, and the eyepiece may also simulate visuals and audio as if the user were in such an environment.
In embodiments, the user may use the eyepiece in combat. The eyepiece may be used to allow a soldier to see through white-out weather conditions. The user may call up overlay maps and/or audio providing information on building trenches, terrain hazards, and the like, to allow the soldier to move safely through the environment. The eyepiece may alert the user by detecting increases or decreases in snow density, letting him know when the terrain under the snow surface has changed, indicating possible trenches, holes or other hazards, objects buried in the snow, and the like. Furthermore, under conditions where visibility is poor, whether or not snow obstructs the user's view, he may be provided with the positions of his group members and of the enemy. The eyepiece may also provide heat signatures in arctic environments to show the user animals and individuals. In embodiments, a user interface in the eyepiece may display the soldier's vital signs and provide warnings when he is in danger from the extreme environmental conditions around him. In addition, the eyepiece may also help the user operate a vehicle in heavy snow conditions by providing the user with alerts from the vehicle regarding transmission slippage, wheel slippage, and the like.
The eyepiece may be used in jungle environments. In addition to the general and/or applicable uses described herein related to training, combat, survival, surveillance purposes, and the like, the eyepiece may further be used in various usage scenarios that may be encountered in environments such as jungle environments. For example, in training, the eyepiece may provide the user with information about which plants are edible, which are poisonous, and which insects and animals may put the user in danger. In embodiments, the eyepiece may simulate the various noises and surroundings the user may encounter in the jungle, so that the environment is not a distraction when in combat. In addition, when in combat or a real jungle environment, the user may be provided with graphic overlays or other maps to show the region around him and/or to help him track where he came from and where he must go. This may alert him to allied and enemy forces in the area, and it may sense movement so as to alert the user to nearby animals and/or insects. Such alerts may help the user survive by avoiding attacks and by finding food. In other embodiments, the user may be provided with augmented reality data, such as through a graphic overlay method, allowing the user to compare known creatures and/or animals with those he encounters, to help the user distinguish which are safe to eat, which are poisonous, and the like. By having information on which specific creatures pose a threat, the user may avoid unnecessary weapon deployment when in an undercover operation or quiet mode.
The eyepiece may also be used in connection with special forces missions. In addition to the general and/or applicable uses described herein related to training, combat, survival, surveillance purposes, and the like, the eyepiece may further be used in various usage scenarios that may be encountered in connection with special forces missions. In embodiments, the eyepiece may be used for specific purposes in undercover missions. For example, the user may communicate completely silently with his group through a user interface that each member can see on his own eyepiece. The information users share may be navigated in the user interface by eye movement, a controller device, and/or the like. As a user makes indications and/or navigates through the user interface and the specific data to be conveyed, other users may also see that data. In embodiments, each user may insert questions through the user interface to be answered by the mission leader. In embodiments, a user may speak or initiate other audio that all users can hear through their eyepieces or other devices. This allows users at various positions on the battlefield to send action plans, instructions, questions, shared information, and the like, and allows them to do so without being detected.
In embodiments, the eyepiece may also be used for military firefighting. As an example, the user may use the eyepiece to run simulations of firefighting scenarios. The device may use augmented reality to simulate fire behavior and the structural damage to a building over time, and it may reproduce realistic scenarios in other ways. As mentioned herein, the training program may monitor the user's progress and/or change the scenario and training modules according to the user's actions. In embodiments, the glasses may be used in actual firefighting. The eyepiece may allow the user to see through smoke by various means described herein. The user may view, download, or otherwise access the layout of the burning building, container, aircraft, vehicle, or structure. In embodiments, the user may have an overview map or other display of where each group member is located. The eyepiece may monitor the gear or other equipment the user wears during the firefight. The user may see his oxygen supply level in his eyepiece and be alerted when he should withdraw to obtain more oxygen. The eyepiece may send notifications from the user's device to a command post outside the structure so as to deploy new personnel entering or leaving the fire scene, and so as to provide status updates and alerts about possible dangers to firefighters. The user may have his vital signs displayed to determine whether his temperature is too high, whether he has lost too much oxygen, and the like. In embodiments, the eyepiece may be used to analyze, according to beam density, heat signatures, and the like, whether there are cracks in beams or moldings, and to inform the user of the structural integrity of the building or other environment. When structural integrity is compromised, the eyepiece may provide automatic warnings.
In embodiments, the eyepiece may also be used for maintenance purposes. For example, the eyepiece may provide the user, before a mission and/or before use, with a checklist of the correct operation of the articles to be used. If proper maintenance is not logged in a database for an article, the operator may be alerted. A virtual maintenance and/or performance history may be provided so that the user can determine the safety of an article, or the measures necessary for safety and/or performance. In embodiments, the eyepiece may be used to execute augmented reality programs and the like for training the user in weapons maintenance and repair, and in courses for technicians on related new and/or advanced equipment. In embodiments, the eyepiece may be used in the maintenance and/or repair of various articles (weapons, vehicles, aircraft, equipment, and the like). The user may use the eyepiece to view visual overlays and/or audio instructions for an article, so that the user does not need to hold a manual while performing maintenance. In embodiments, video, still images, 3D and/or 2D images, animated images, audio, and the like may be used for such maintenance. In embodiments, the user may view an overlay on the article and/or video of various images showing the user which parts to remove, in what order and by what method, and which parts to add, replace, repair, or enhance. In embodiments, such a maintenance program may be an augmented reality program or the like. In embodiments, the user may use the eyepiece to connect with a machine or piece of equipment to monitor its operation and/or vital statistics and to help with repair and/or provide maintenance information. In embodiments, the user is able to use the eyepiece to suggest the next consecutive action during maintenance, and the eyepiece may convey to the user information about whether such an action might damage the machine, how to help repair the machine, the likelihood that the machine will run after the following steps, and the like. In embodiments, the eyepiece may be used for the maintenance of all articles, machines, vehicles, equipment, aircraft, and the like mentioned herein or otherwise applicable to, or encountered in, military environments.
The eyepiece may also be used in environments where the spoken language is to some degree unfamiliar to the user. As an example, a soldier may use the eyepiece and/or device to obtain near-real-time translation of those speaking around him. Through the device's earphones, he may hear, in his native language, a translation of those speaking to him. In addition, he may record and translate comments made by prisoners of war and/or other detainees. In embodiments, the soldier may have a user interface that can translate phrases, or that provides the user with translations through earphones, through text images in the user's eyepiece, or otherwise. In embodiments, the eyepiece may be used by linguists to provide an experienced linguist with supplemental information about the dialect spoken in a specific region, or about which dialect the people near him are speaking. In embodiments, a linguist may use the eyepiece to record speech samples for further comparison and/or study. Other experts may use the eyepiece with speech analysis to determine whether a speaker is experiencing anger or shame, is lying, and the like, by monitoring changes in voice, tone, stuttering, and so on. Even if the listener and speaker speak different languages, this may still convey to the listener the speaker's original intent.
In embodiments, the eyepiece may allow the user to interpret body language, facial expressions, and/or other biometric identification data from another person. For example, the user may use the device to analyze a person's pupil dilation, blink rate, changes in voice, body movement, and the like to determine whether the person is lying, is hostile, is under stress, may pose a threat, and so forth. In embodiments, the eyepiece may also collect data such as facial expressions to detect and alert the user when a speaker is lying, may be making unreliable statements, is hostile, and the like. In embodiments, when interacting with a group or with other individuals, the eyepiece may provide the user with alerts warning of potentially threatening individuals who may be disguised as non-combatants or ordinary citizens, or of other individuals. User alerts may be audio and/or visual, and may appear in a user interface in the user's eyepiece or be overlaid in the user's vision and/or associated with the individual under scrutiny in the user's line of sight. As described herein, such monitoring may be performed at close range in a disguised or unobtrusive manner while the user employs the eyepiece and/or device, may collect data covertly from a distance, or may be performed with the knowledge and/or consent of the individual in question.
The eyepiece may also be used when handling bombs and in other hazardous environments. As an example, the eyepiece may provide the user with an alert about changes in soil density near a roadside, which may warn the user and/or group of a buried bomb. In various embodiments, a similar approach may be employed in other environments, such as testing the density of snow in arctic environments to determine whether bombs or other explosives are present. In embodiments, the eyepiece may provide density calculations to determine whether luggage and/or shipped articles have an unexpected density or fall outside the particular density range expected for the shipped article. In embodiments, the eyepiece may provide similar density calculations and, if a density is found to fall within the range expected for an explosive device, other weapon, or the like, provide a warning. Those skilled in the art will appreciate that bomb detection may also be employed by the eyepiece via chemical sensors and/or other means known in the art, and may be used in various embodiments. In embodiments, the glasses may be used in bomb disposal. Augmented reality or other audio and/or visual overlays may be provided to the user with instructions on how to disarm a particular type of bomb that has been found. Similar to the maintenance programs described above, instructions for disarming a bomb may be provided to the user. In various embodiments, if the bomb type is unknown, the user may be provided with a user interface giving instructions for safe handling and the next steps that may be taken. In embodiments, the user may be warned of a nearby potential bomb and may be presented with instructions for safely handling the situation, such as how to safely evacuate the bomb area, how to safely exit a vehicle containing a bomb, how close to the bomb the user may safely remain, and how to disarm the bomb, with instructions suited to the situation and the user's skill level. In embodiments, the eyepiece may also provide the user with training for such hazardous environments and the like.
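The density screening described above reduces to a range check: compare a measured density against the range expected for the declared contents, and against ranges associated with threat materials. The material table and all numeric thresholds below are invented for illustration only:

```python
# Illustrative density screen: flag an item whose measured density falls
# outside the range expected for its declared contents, or inside a range
# associated with a threat material. All numbers are hypothetical.
EXPECTED = {"clothing": (0.1, 0.4), "books": (0.6, 1.1)}   # g/cm^3
THREAT = {"explosive-like": (1.4, 1.9)}                    # g/cm^3

def screen_density(declared, density):
    """Return OK, a generic warning, or a threat alert for one item."""
    lo, hi = EXPECTED.get(declared, (0.0, float("inf")))
    if not (lo <= density <= hi):
        for name, (tlo, thi) in THREAT.items():
            if tlo <= density <= thi:
                return f"ALERT: {name}"
        return "WARN: unexpected density"
    return "OK"
```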
In embodiments, the eyepiece may detect various other dangers, such as biological leaks, chemical leaks, and the like, and provide the user with warnings of unsafe conditions. In embodiments, the user may also be provided with various instructions for mitigating the situation in such an environment and/or under such conditions, for getting to safety, and for keeping others safe. Although situations involving bombs have been described, it is intended that the eyepiece may similarly be used under various dangerous and/or unsafe conditions for vigilance and mitigation, and/or to provide instructions and the like when such dangers and hazards are encountered.
In various embodiments, the eyepiece may be used in general fitness and training environments. The eyepiece may provide the user with information such as the miles traveled during a run, hike, or walk. The eyepiece may provide the user with information such as the number of exercises performed, the calories burned, and the like. In embodiments, the eyepiece may provide the user with virtual instruction on correctly performing a particular exercise, and may provide the user with additional exercises as desired or appropriate. In addition, the eyepiece may provide a user interface in which physical fitness benchmarks are presented to a soldier to meet the requirements of a particular program or otherwise. Further, the eyepiece may provide data on the number and type of exercises that need to be performed for the user to meet such requirements. Such requirements may be tailored toward special forces qualification, basic training, and the like. In embodiments, the user may work with virtual obstacles during exercise, sparing the user from setting up real hurdles, obstacles, and the like.
Although specific embodiments and usage scenarios are described herein, such descriptions are not intended to be limiting. Further, it is intended that the eyepiece may be used in examples that will be apparent to those skilled in the art. It is also anticipated that an eyepiece described as suitable for a particular environment may be used in various other environments, even if not specifically mentioned in connection therewith.
In embodiments, the user may access and/or otherwise manipulate an information library stored on a secure digital (SD) card, mini SD card, or other memory, loaded remotely over a tactical network, or stored by other means. The library may be part of the user's device and/or may be remotely accessible. The user's device may include a DVR or other means for storing information and/or feeds collected by the user, and recorded data may be delivered elsewhere on demand. In embodiments, the library may include local threat images, information and/or images of individuals on a threat list, and the like. The threat library may be stored on an onboard mini SD card or other device. In embodiments, it may be loaded remotely over a tactical network. Further, in various embodiments, the information library may include programs and other information or data useful in the maintenance of military vehicles, or may be of any type or contain information about any subject. In various embodiments, the information library may be used together with the device so that data is transmitted and/or sent to and from the storage medium and the user's device. As an example, data may be sent from a stored library to the user's eyepiece so that he may view images of local persons of interest. In various embodiments, data may be sent to and from a library included in the soldier's equipment or located remotely, and data may be sent to and from the various devices described herein. Further, data may be sent between the various devices described herein and libraries of the kind described above.
In embodiments, military simulation and training may be employed. As an example, game scenarios generally used for entertainment may be adapted and used for battlefield simulation and training. Various devices, such as the eyepieces described herein, may be used for such purposes. Near-field communication may be used in such simulations to change personnel, present dangers, change tactics and scenarios, and for various other communications. Such information may be posted to share information at the place where it is needed and to provide instructions and/or information. Various scenarios, training modules, and the like may run on the user's device. By way of example only and not as a limitation on such training, the user's eyepiece may display an augmented reality combat environment. In embodiments, the user may act and react in this environment as he would in actual combat. The user may advance or fall back according to his performance. In various embodiments, the user's movements may be recorded so that feedback on his performance may be provided. In various embodiments, the user may be provided with feedback whether or not his performance is recorded. In embodiments, the information posted as described above may be password- or biometrically-protected and/or encrypted, and may be available immediately or after a specific period of time. Information stored electronically in this way may be updated immediately as changes are ordered, keeping it ideally up to date.
Near-field communication or other means may be used in training environments and for maintenance to share posted information and provide instructions and/or information at the place where the information is needed. As an example, information may be posted in a classroom, in a maintenance bay, on a shop floor, or anywhere else training and instruction are needed. A user device such as the eyepieces described herein allows such information to be transmitted and received. Information may be shared via augmented reality, whereby the user enters a specific region and, once there, is notified of such information. Similar to what is described herein, near-field communication may be used for maintenance. As an example, information may be posted precisely at the place where it is needed, such as in a maintenance bay, on a shop floor, associated with the article to be repaired, and the like. More specifically, but not as a limitation of the invention, repair instructions may be posted under the hood of a military vehicle and made visible through a soldier's eyepiece. Similarly, various instructions and training information may be shared with various users in any given training situation, such as training for combat and/or instruction for military equipment maintenance. In embodiments, the information posted as described above may be password- or biometrically-protected and/or encrypted, and may be available immediately or after a specific period of time. Such electronically stored information may be updated immediately as changes are ordered, keeping it ideally up to date.
In embodiments, applications of the invention may be used for face recognition or sparse face recognition. Such sparse face recognition may use one or more facial features to exclude possibilities when identifying a person of interest. Sparse face recognition may have automatic obstruction masking and error and angle correction. In embodiments, by way of example and not as a limitation of the invention, the eyepiece, flashlight, and other devices described herein allow sparse face recognition. This may work similarly to human vision, rapidly excluding non-matching regions or entire profiles by immediately applying sparse matching to all image vectors. This may make false positives nearly impossible. In addition, multiple images may be used simultaneously to expand the vector space and improve accuracy. This may work with multiple databases or with multiple target images, according to availability or operational requirements. In embodiments, the device may manually or automatically identify one or more specific clean features that suffer the least reduction in accuracy. As an example, accuracies may vary over various ranges, and may be at least 87.3% for the nose, 93.7% for the eyes, and 98.3% for the mouth and chin. In addition, angle correction via facial reconstruction may be used, and in embodiments, angle correction via facial reconstruction of up to 45 degrees may be achieved. This may be further enhanced with 3D image mapping techniques. In addition, blurred-region masking and replacement may be used. In embodiments, blurred-region masking and replacement of 97.5% and 93.5% may be achieved for sunglasses and scarves, respectively. In embodiments, an ideal input image may be 640 by 480. A target image may be reliably matched at less than 10% of the input resolution despite long range or atmospheric obstruction. In addition, the particular ranges mentioned above may be greater or lesser in various embodiments.
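The exclusion-based sparse matching described above might be sketched as a per-feature-region similarity check that keeps only gallery entries whose every unobstructed region matches the probe, mimicking obstruction masking by simply skipping regions absent from the probe. The feature vectors, region names, and threshold are illustrative assumptions, not the patented algorithm:

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def sparse_match(probe, gallery, threshold=0.9):
    """Keep only gallery entries whose every available feature region
    (e.g. eyes, nose, mouth) matches the probe; regions missing from the
    probe are skipped, mimicking obstruction masking."""
    survivors = []
    for name, features in gallery.items():
        regions = set(probe) & set(features)   # only unobstructed regions
        if regions and all(cosine(probe[r], features[r]) >= threshold
                           for r in regions):
            survivors.append(name)
    return survivors
```

The early-exclusion character comes from `all(...)`, which abandons a candidate at the first non-matching region rather than scoring every region.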
In various embodiments, the devices and/or networks described herein may be applied to the identification and/or tracking of friends and/or allied forces. In embodiments, face recognition may be used to positively identify friends and/or friendly forces. In addition, real-time network tracking, and/or real-time blue force and red force network tracking, may allow the user to know where his allies and/or friendly forces are. In various embodiments, there may be a visual separation between blue and red forces and/or forces identified by various markers and/or patterns. Further, the user may geo-locate an enemy and share the enemy's position in real time. The positions of friendly forces may likewise be shared in real time. Devices for such applications may be the biometric-collection glasses described herein, eyepieces, other devices, and those known to persons skilled in the art.
In embodiments, the devices and/or networks described herein may be used in medical diagnosis. As an example, such devices may enable healthcare providers to make remote diagnoses. Further, and as an example, when battlefield medics arrive on scene or work remotely, they may use devices such as fingerprint sensors to immediately call up a soldier's medical history, allergies, blood type, and other time-sensitive medical data so as to administer the most effective treatment. In embodiments, such data may be called up via face recognition, iris recognition, or the like of the soldier, realized through the eyepiece described herein or another device.
In embodiments, the user may share various data through the various networks and devices described herein. As an example, a 256-bit AES-encrypted video wireless transceiver may bidirectionally share video between groups and/or between vehicle computers. In addition, biometric data collection, potential enrollment, identification and verification of persons of interest, biometric data of persons of interest, and the like may be shared locally and/or remotely over a wireless network. Further, such identification and verification of potential persons of interest may be completed, or assisted, by data shared locally and/or supported remotely over a wireless network. The biometric recognition systems and devices described herein likewise enable data sharing over a network. In embodiments, data may be shared with, from, and/or among various devices, individuals, vehicles, locations, units, and the like. In various embodiments, there may be intra-unit and inter-unit communication and data sharing. Data may be shared via, from, and/or among the following: existing communication resources, mesh networks or other networks, military-grade ultra-wideband transceiver caps with 256-bit encryption, military-grade cables, mobile SD and/or micro SD memory cards, Humvees, PSDS2, unmanned aerial vehicles, WBOTM or other relays, combat radios, mesh-networked computers, various devices such as but not limited to those described herein, 3G/4G-networked biometric phone computers, digital dossiers, tactical operations centers, command posts, DCSG-A, BAT servers, individuals and/or groups of individuals, and any eyepiece and/or device described herein and/or devices known to those skilled in the art, and the like.
In embodiments, the devices described herein or other devices may include a viewing pane that projects an image rearward onto any surface for viewing by a squad and/or group leader conducting a combat group. The transparent viewing pane, or other viewing pane, may be rotated 180 degrees or by another amount in projection mode to share data with a group and/or multiple individuals. In embodiments, devices including but not limited to monocular and binocular NVGs may dock with all or nearly all tactical radios in use, allowing users to share live video, S/A, biometric data, and other data in real time or otherwise. Devices such as the binoculars and monoculars mentioned above may be standalone VIS, NIR, and/or SWIR binoculars or monoculars, may include color day/night viewing and/or a digital display, and may include a compact, encrypted, wireless-enabled computer for docking with tactical radios. Various data may be shared in real time or near real time over combat radios, mesh networks, and long-range tactical networks. In addition, data may be organized into digital dossiers. Data on persons of interest (POIs) may be organized into digital dossiers regardless of whether such POIs are enrolled. In embodiments, shared data may be compared and manipulated. Although specific devices are mentioned herein, any device may share information as described herein and/or as those skilled in the art will appreciate.
In embodiments, biometric data, video, and various other types of data may be collected by various devices, methods, and apparatus. For example, fingerprints and other data may be collected from weapons and other objects from war, terrorist activity, and/or crime scenes. Such collection may be captured by video or other means. Pocket biometric cameras, the flashlight with embedded still camera described herein, the various other devices described herein, or other devices may collect video and may record, monitor, collect, and identify biometric image data. In embodiments, various devices may record, collect, identify, and verify data, including biometric data related to faces, fingerprints, latent fingerprints, latent palm prints, irises, voices, pocket litter, scars, tattoos, and other identifying marks and environmental data. Data may be geo-located and date/time-stamped. Devices may capture EFTS/EBTS-compliant images for matching and filing by any biometric matching software. In addition, video scanning and potential matching against embedded or remote iris and face databases may be performed. In embodiments, various biometric data may be captured and/or compared against databases, and/or may be organized into electronic records. In embodiments, imaging and detection systems may provide biometric scanning and allow feature tracking and iris recognition of multiple subjects. Subjects may move into or out of a crowd at speed and be identified immediately, and such images and/or data may be stored and/or analyzed locally and/or remotely. In embodiments, the device may perform multi-modal biometric recognition. For example, the device may collect and identify face and iris, iris and latent fingerprint, and various other combinations of biometric data. In addition, the device may record video, voice, gait, fingerprints, latent fingerprints, palm prints, latent palm prints, and the like, along with other distinguishing marks and/or movements. In various embodiments, biometric data may be filed using manual entry, with additional actuation for capture of the most specific image. Data may be automatically geo-located, time/date-stamped, and filed into a digital dossier with a locally or network-assigned GUID. In embodiments, the device may record full fingerprint-sensitive scans of four-finger slaps and rolls, fingerprint slaps and rolls, palm prints, fingertips, and fingerprints. In embodiments, an operator may collect POIs while vetting local forces and verify POIs against onboard or remote databases. In embodiments, the device may access a web portal enabling biometric watch-list databases, and/or may include existing biometric pre-enrollment software for POI acquisition. In embodiments, biometrics may be matched and filed by any approved biometric matching software for sending and receiving secure voice, video, and data. The device may combine and/or otherwise analyze biometric content. In embodiments, biometric data may be collected into biometric identification standard image and data formats, which may be cross-referenced for near-real-time or real-time data communication with Department of Defense Biometric Authoritative or other databases. In embodiments, the device may use algorithms for detection, analysis, and the like related to fingerprint and palm-print, iris, and face images. In embodiments, the device may simultaneously illuminate an iris or latent fingerprint for integrated analysis. In various embodiments, high-speed video may be used to capture specific images under unstable conditions, and the device may use an intuitive tactical display to promote the rapid dissemination of situational awareness. Real-time situational awareness may be provided to command posts and/or tactical operations centers. In various embodiments, the device allows each soldier to become a sensor, observing and reporting. Collected data may be marked with the date, time, and geographic location of collection. In addition, biometric images may be NIST/ISO compliant, including ITL 1-2007. Further, in embodiments, a laser range finder may assist in biometric capture and target location. A threat library may be stored on an onboard mini SD card or loaded remotely over a tactical network. In embodiments, the device may use wideband and/or ultra-wideband transceivers to wirelessly transmit encrypted data between devices. The device may perform onboard matching of potential POIs against an embedded database, or securely against a database on a battlefield network. In addition, the device may use high-speed video to capture specific images under all environmental conditions. Biometric profiles may be uploaded, downloaded, and searched in seconds or less. In various embodiments, the user may use the device to geo-locate POIs at a safe distance using visual biometrics, and positively identify POIs with robust sparse recognition algorithms for face, iris, and the like. In embodiments, the user may combine and imprint visual biometric features on a single display, without alerting the POI, with enhanced target highlighting and comprehensive match viewing and alerting. Such displays may be on various devices, eyepieces, handheld devices, and the like.
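The geo-located, time-stamped dossier record with an assigned GUID described above can be sketched as a simple data structure. The field names and record layout below are assumptions for illustration, not a format defined in the disclosure:

```python
import uuid
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class BiometricRecord:
    """One dossier entry: modality data tagged with GUID, location, time."""
    modality: str                  # e.g. "iris", "latent_fingerprint"
    payload: bytes                 # raw capture (image or template)
    lat: float                     # geo-location of collection
    lon: float
    guid: str = field(default_factory=lambda: str(uuid.uuid4()))
    captured_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

def file_record(dossier, record):
    """File a record into a digital dossier keyed by its GUID."""
    dossier[record.guid] = record
    return record.guid
```

In a networked deployment the GUID would be coordinated with the network rather than generated locally, but a locally generated UUID4 shows the idea.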
In embodiments, when local nationals are screened at controlled checkpoints and/or stations, an operator may use unobtrusive face and iris biometrics to collect, enroll, identify, and verify POIs against a watch list. In various embodiments, biometric collection and identification may be carried out at crime scenes. For example, at a bombing or other crime scene, an operator may rapidly collect biometric data from all potential POIs. Data may be collected, geo-tagged, and stored in a digital dossier for comparison against past and future crime scenes. In addition, during house and building searches, biometric data may be collected from POIs in real time. Such displayed data may allow the operator to know whether to release, detain, or arrest a potential POI. In other embodiments, unobtrusive data collection and identification may be carried out in street environments and the like. The user may, for example, move through a market and blend in with local residents while collecting biometric features, geographic location, and/or environmental data with minimal visible impact. In addition, biometric data may be collected from the dead or wounded to identify whether they are POIs. In various embodiments, the user may identify known or unknown POIs by face identification, iris identification, fingerprints, visible identifying marks, and the like of the dead, wounded, or others, and keep electronic records updated with such data.
In embodiments, a laser range finder and/or inclinometer may be used to determine the position of persons of interest, improvised explosive devices, other articles of interest, and the like. Various devices described herein may include a digital compass, inclinometer, and laser range finder to provide the geographic location of POIs, targets, IEDs, articles of interest, and the like. The geographic locations of POIs and/or articles of interest may be sent over networks, tactical networks, and the like, and such data may be shared among individuals. In embodiments, the device allows an optical array and laser range finder to geo-locate and range multiple POIs simultaneously through continuous observation of battlefield groups or crowds in uncontrolled environments. In addition, in embodiments, the device may include a laser range finder and designator to range and paint one or more targets simultaneously through continuous observation. Further, in embodiments, the device may be soldier-worn, handheld, or the like, and may locate enemies on the battlefield using a target geographic position derived from an integrated laser range finder, digital compass, inclinometer, and GPS receiver. In embodiments, the device may include an integrated digital compass, inclinometer, MEMS gyroscope, and GPS receiver to record and display the soldier's position and the direction of his line of sight. In addition, various devices may include an integrated GPS receiver or other GPS receiver for position and heading accuracy, an IMU, a 3-axis digital compass or other compass, a laser range finder, a gyroscope, a MEMS-based gyroscope, an accelerometer, and/or an inclinometer. The various devices and methods described herein may enable a user to locate enemies and POIs on the battlefield and share such information with friendly forces over a network or by other means.
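Deriving a target's geographic position from the observer's GPS fix, compass bearing, inclinometer angle, and laser range, as described above, is a small trigonometry exercise. The sketch below uses a local flat-earth approximation (adequate for ranges of a few kilometers); the function signature and constants are illustrative, not taken from the disclosure:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def target_position(lat_deg, lon_deg, alt_m, bearing_deg, incline_deg, range_m):
    """Estimate a target's lat/lon/alt from an observer's GPS position,
    digital-compass bearing (degrees clockwise from true north),
    inclinometer angle (degrees above horizontal), and laser range."""
    horiz = range_m * math.cos(math.radians(incline_deg))   # ground distance
    vert = range_m * math.sin(math.radians(incline_deg))    # height offset
    north = horiz * math.cos(math.radians(bearing_deg))
    east = horiz * math.sin(math.radians(bearing_deg))
    dlat = math.degrees(north / EARTH_RADIUS_M)
    dlon = math.degrees(east / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon, alt_m + vert
```

A fielded system would use a proper geodetic datum (e.g. WGS-84) and correct the compass for magnetic declination; the sketch shows only the core bearing/range/inclination geometry.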
In embodiments, communications together with geo-location may be mesh-networked or otherwise networked among users. In addition, each user may be provided with a pop-up window or other position map of all users or of adjacent users. This may provide the user with knowledge of where friendly forces are located. As described above, enemy positions may be discovered. Enemy positions may be tracked and provided through a pop-up window or other enemy position map, giving the user knowledge of where both friendly and enemy forces are located. The positions of friendly and enemy forces may be shared in real time. The user may be provided with a map depicting such positions. Such maps of friendly forces, enemy positions and/or numbers, and combinations thereof may be displayed in the user's eyepiece or other device for viewing.
In embodiments, devices, methods, and applications may provide hands-free, wireless, visually and/or audio-enhanced maintenance and repair instructions. Such applications may include RFID sensing for part location and related equipment. In various examples, the user may use the device for augmented-reality-guided field repair. Such field repair may be guided by hands-free, wireless maintenance and repair instructions. Eyepieces, projectors, monoculars, and the like, and/or other devices described herein, may display images of the maintenance and repair process. In embodiments, such images may be still images and/or video, animated, 3-D (three-dimensional), 2-D (two-dimensional), and the like. In addition, the user may be provided with voice and/or audio annotation of such a process. In embodiments, this application may be used in high-threat environments, where working undetected is a security consideration. Augmented reality images and video may be projected onto, or otherwise overlaid on, the actual object the user is working on, or in the user's field of view of the object, to provide video, diagrams, text, or other instructions for the process to be performed. In various embodiments, libraries of programs for various processes may be downloaded and accessed, by wire or wirelessly, from a body-worn computer or from remote devices, databases, and/or servers. Such programs may be used for actual maintenance or for training purposes.
In embodiments, the devices, methods, and descriptions found herein may be used for an inventory status notification system. In various embodiments, such a tracking system may allow scanning from distances of up to 100 meters, using a 2 Mb/s data transmission rate to handle more than 1000 simultaneous links. When viewing and/or near inventory, the system may provide annotated audio and/or visual information related to inventory tracking. In embodiments, devices may include the eyepieces, monoculars, binoculars, and/or other devices described herein, and inventory tracking may use SWIR, SWIR color, and/or night vision technologies, body-worn wired or wireless computers, wireless UWB security tags, RFID tags, helmet/hard-hat readers and displays, and the like. In embodiments, and by way of example only, the user may receive visual and/or audio information related to inventory, such as which articles are to be destroyed or transferred, the quantity of articles destroyed or transferred, where articles are to be transferred or discarded, and the like. In addition, such information may highlight or otherwise provide visible marking and indication of the article in question. Such information may be displayed on the user's eyepiece, projected onto the article, shown on a digital or other display or monitor, and the like. The article in question may be tagged with UWB and/or RFID tags, and/or an augmented reality program may be used to provide the user with visualization and/or indication, so that the various devices described herein may provide the information needed for inventory tracking and management.
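The tag-driven inventory annotations described above could be modeled as a lookup from a scanned tag ID to a disposition instruction rendered as audio or overlay text. The tag IDs, item names, actions, and message strings below are invented for illustration:

```python
# Hypothetical mapping from RFID/UWB tag IDs to disposition instructions.
INVENTORY = {
    "tag-0017": {"item": "ration crate", "action": "transfer", "dest": "Depot B"},
    "tag-0042": {"item": "expired kit", "action": "destroy", "dest": None},
}

def annotate(tag_id):
    """Build the audio/visual annotation shown when a tagged item is scanned."""
    entry = INVENTORY.get(tag_id)
    if entry is None:
        return f"{tag_id}: no record"
    if entry["action"] == "transfer":
        return f"{entry['item']}: transfer to {entry['dest']}"
    return f"{entry['item']}: {entry['action']}"
```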
In various embodiments, when fighting fires, the SWIR, SWIR color, monocular, night vision, body-worn wireless computer, eyepiece, and/or other devices described herein may be used. In embodiments, the user may have improved visibility through smoke, and the user's device may display the position of each individual on an overlay map or other map, so that the user may know the positions of firefighters and/or others. The device may provide a real-time display of the positions of all firefighters, and provide hot-spot detection of regions below and above 200 degrees Celsius without triggering false alarms. The device may also provide facility maps, displayed on the device, projected from the device, and/or overlaid in the user's line of sight by augmented reality or other means, to help guide the user through a structure and/or environment.
The systems and devices described herein may be configured with any software and/or algorithms to meet mission-specific needs and/or for system upgrades.
With reference to Figure 73, the eyepiece 100 may be interfaced with a "biometric flashlight" 7300, which for example includes sensors for acquiring biometric signatures and biometric data for identifying individuals, in the form factor and with the functions of a typical handheld flashlight. The biometric flashlight may interface with the eyepiece directly, such as through a wireless connection straight from the biometric flashlight to the eyepiece 100, or, as shown in the embodiment depicted in Figure 73, through an intermediary transceiver 7302 that interfaces wirelessly with the biometric flashlight and connects to the eyepiece through a wired or wireless interface (for example, where the transceiver device is wearable, such as on a belt). Although other mobile biometric devices are depicted in the figures without a transceiver being shown, one skilled in the art will appreciate that any of the mobile biometric devices may communicate with the eyepiece 100 indirectly through the transceiver 7300, communicate with the eyepiece 100 directly, or operate standalone. Data may be transferred from the biometric flashlight to eyepiece memory, to memory in the transceiver device, to a removable memory card 7304 that is part of the biometric flashlight, and the like. As described herein, the biometric flashlight may include an integrated camera and display. In embodiments, the biometric flashlight may be used as a standalone device without the eyepiece, where data is stored internally and information is presented on the display. In this way, civilian personnel may more easily and safely use the biometric flashlight. The biometric flashlight may have a range for capturing certain types of biometric data, such as a range of 1 meter, 3 meters, 10 meters, and so on. The camera may provide monochrome or color images. In embodiments, the biometric flashlight may provide a covert biometric data collection flashlight-camera that can rapidly geo-locate, monitor, and collect environmental and biometric data for on-board or remote biometric identity matching. In one example usage scenario, a soldier may be assigned to a night sentry post. The soldier may use the biometric flashlight simply as a typical flashlight, but where an unsuspecting individual is illuminated by the device, which also acquires and/or runs biometric features as part of a data collection and/or biometric identification procedure.
Referring now to Figure 76, the 360° imager uses digital foveation down to any given region, as small as a set of pixels, to deliver a high-resolution image of the specified region. Embodiments of the 360° imager may feature an ultra-high-resolution foveated field of view with a simultaneous and independent 10x (10 times) optical zoom, characterizing a continuous 360° × 40° panoramic field of view (FOV). The 360° imager may include two 5-megapixel sensors with 30 fps (frames per second) imaging capability and an image acquisition time of < 100 ms. The 360° imager may include a gyro-stabilized platform with independently stabilized image sensors. The 360° imager may have only one moving part and two image sensors, which permits reduced image-processing bandwidth in a compact optical system design. The 360° imager may also feature low angular resolution with high-speed video processing, and may be sensor-agnostic. The 360° imager may be used in a device, in a moving vehicle with a gyro-stabilized platform, or in surveillance equipment mounted on a traffic light or telephone pole, a robot, an aircraft, or any other location permitting persistent monitoring. Multiple users may independently and simultaneously view the environment imaged by the 360° imager. For example, video captured by the 360° imager may be displayed in the eyepiece so that all recipients of the data (such as all occupants of a combat vehicle) have real-time 360° situational awareness. The panoramic 360° imager can recognize a person at 100 meters, and the 10x (10 times) foveated zoom can be used to read a license plate at 500 meters. The 360° imager permits persistent recording of the environment and features an independently controllable foveated imager.
Figure 76A depicts the assembled 360° imager, and Figure 76B depicts a cross-sectional view of the 360° imager. The 360° imager includes a capture mirror 7602, objective lens 7604, beam splitter 7608, lenses 7610 and 7612, MEMS mirror 7614, full-field sensor 7618, panoramic image lens 7620, folding mirror 7622, foveal sensor 7624, and foveal image lens 7628. Images collected with the 360° imager may be geo-located and stamped with time and date. Other sensors may be included in the 360° imager, such as a thermal imaging sensor, an NIR sensor, a SWIR sensor, and the like. The MEMS mirror 7614 is the only reflective prism in the system that captures using a single-viewpoint hemispherical surface, permitting high and uniform resolution. The imager design achieves < 0.1° scanning accuracy, < 1% foveal distortion, 50% MTF at 400 lp/mm, and < 30 millisecond foveal acquisition.
The 360° imager may be part of a network that links back, wirelessly or physically, to a TOC or database. For example, a user may use a display with a 360° imager driver to view images from the 360° imager wirelessly, or may view images from the 360° imager over a wired connection (such as a military-grade cable). The display may be a combat radio device or a mesh-networked computer networked to headquarters. Data from a database (such as a Department of Defense authoritative database) may be accessed by a combat radio or a mesh-networked computer, such as by using a removable memory storage card or through a network connection.
Referring now to Figure 77, a coincident multi-field-of-view camera may be used for imaging. The feed from the coincident multi-field-of-view camera may be sent to the eyepiece 100 or any other suitable display device. In one embodiment, the coincident multi-field-of-view camera may be a fully connected, 3- or 4-coincident-field-of-view SWIR/LWIR imaging and target designation system that permits simultaneous wide, medium, and narrow field-of-view surveillance, with each sensor having VGA or SXVGA resolution for day or night operation. The lightweight, gimbal-mounted sensor array may be inertially stabilized and geo-referenced, so that its NVG-compatible laser-pointer capability enables highly accurate sensor positioning and target designation under all conditions. Its unique multiple simultaneous fields of view enable wide-area surveillance in the visible, near-infrared, short-wave infrared, and long-wave infrared regions. When coupled with output from a digital compass, inclinometer, and GPS receiver, it also permits the high-resolution narrow field of view to be used for more accurate identification and designation of point-to-target coordinates.
In one embodiment of the coincident multi-field-of-view camera, there may be separate, steerable, coincident fields of view, such as 30°, 10°, and 1°, with automatic POI or multiple-POI tracking, face and iris recognition, on-board matching, and 256-bit AES-encrypted UWB for wireless communication with laptop computers, combat radios, or other networked or mesh-networked devices. The camera may be networked to a CP, TOC, and biometric database, and may include a 3-axis, gyro-stabilized, high-dynamic-range, high-resolution sensor that delivers capability under conditions ranging from dazzling sunlight to extremely low light. Identifications (IDs) may be stored and analyzed immediately, locally, or in remote storage. The camera may feature accurate "find and fix" geo-location of POIs and threats to ranges of > 1,000 meters, an integrated 1550-nanometer eye-safe laser rangefinder, networked GPS, a 3-axis gyroscope, a 3-axis magnetometer, an accelerometer and inclinometer, electronic image enhancement and electronic stabilization to assist tracking, full-motion (30 frames per second) color video recording, ABIS, EBTS, EFTS, and JPEG 2000 compatibility, and compliance with MIL-STD-810 for operation in extreme environments. The camera may be mounted via a gimbaled ball system, where the gimbaled ball system is integrated to isolate the motion of the biometric capture scheme, for non-cooperative biometric collection and identification with laser ranging and POI geo-location, such as at roadblocks, checkpoints, and facilities. Multi-modal biometric recognition includes collecting and identifying faces and irises and recording video, gait, and other distinguishing marks or movements. The camera may include the ability to geo-locate all POIs and collected data with time, date, and position stamps. The camera facilitates rapid dissemination of situational awareness to network-enabled squads, CPs, and TOCs.
In another embodiment of the coincident multi-field-of-view camera, the camera features 3 separate color VGA SWIR electro-optic modules providing coincident 20°, 7.5°, and 2.5° fields of view, and 1 LWIR thermal electro-optic module for imaging POIs and targets over a wide area with pinpoint precision in an ultra-compact configuration. The 3-axis, gyro-stabilized, high-dynamic-range, color VGA SWIR camera delivers the ability to see under conditions ranging from dazzling sunlight to extremely low light, and through fog, smoke, and haze without "blooming." Geo-location may be obtained by integrating a micro-electromechanical systems (MEMS) 3-axis gyroscope and an augmented GPS receiver with 3-axis magnetometer data and accelerometers. An integrated 1840-nanometer eye-safe laser rangefinder and target designator, a GPS receiver, and an IMU provide "find and fix" accurate geo-location of POIs and threats to distances of 3 kilometers. The camera displays and stores full-motion (30 frames per second) color video in a "camcorder on a chip," and stores it on a solid-state, removable drive for remote access in flight or for post-operation review. Electronic image enhancement and electronic stabilization assist tracking of POIs and geo-location, ranging, and designation of targets. The eyepiece 100 thus delivers an unobstructed "vision" of threats by displaying the feed from the coincident multi-field-of-view camera. In some embodiments of the eyepiece 100, the eyepiece 100 may also display sensor images, moving maps, and data "see-through," with a flip-up/flip-down electro-optic display system providing an unobstructed view of the soldier's own weapon. In one embodiment, the flip-up/flip-down electro-optic display system can be snapped into the standard NVG mount of any MICH or PRO-TECH helmet.
Figure 77 depicts an embodiment of the coincident multi-field-of-view camera, including a laser rangefinder and designator 7702, total internal reflection mirror 7704, mounting ring 7708, total internal reflection mirror 7710, total internal reflection mirror 7714, anti-reflection honeycomb ring 7718, 1280x1024 SWIR 380-1600 nanometer sensor 7720, anti-reflection honeycomb ring 7222, 1280x1024 SWIR 380-1600 nanometer sensor 7724, anti-reflection honeycomb ring 7728, and 1280x1024 SWIR 380-1600 nanometer sensor 7730. Other embodiments may include additional TIR lenses, FLIR sensors, and the like.
With reference to Figure 78, a flight eye is depicted. The feed from the flight eye may be sent to the eyepiece 100 or any other suitable display device. The flight eye may include multiple individual SWIR sensors mounted in a folded imager array with multiple FOVs. The flight eye is a low-profile surveillance and target designation system that can use each sensor at VGA to SXGA resolution, by day or night, to achieve continuous imaging of an entire battlefield through fog, smoke, and haze in a single low-altitude pass. Its modular design permits selective, point-and-shoot resolution changes of any element from 1° to 30°, for telephoto to wide-angle imaging of any region of the array. The resolution of each SWIR imager is 1280x1024, sensitive at 380-1600 nanometers. A multi-DSP array board "stitches" all the images together and automatically removes overlapping elements for a seamless image. A coincident 1064-nanometer laser designator and rangefinder 7802 may be mounted coincident with any imager without blocking its FOV.
With reference to Figure 106, the eyepiece 100 may operate in conjunction with internal software applications 7214 of the eyepiece (developed in association with the eyepiece application development environment 10604), where the eyepiece 100 may include a projection facility adapted to project images onto a see-through or translucent lens, so that the wearer of the eyepiece can view the surrounding environment as well as the displayed images provided by the internal software application 7214. A processor, which may include memory and an operating system (OS) 10624, may host the internal software application 7214, control the interface between eyepiece command-and-control and the software application, control the projection facility, and the like.
In embodiments, the eyepiece 100 may include an operating system 10624 running on a multimedia computing facility 7212 that hosts internal software applications 7214, where the internal application 7214 may be a software application developed by a third party 7242 and provided for download to the eyepiece 100, such as from an app store 10602, a 3D AR eyepiece app store 10610, a networked third-party application server 10612, and the like. The internal application 7214 may interact with eyepiece processing facility control processes 10634, such as in combination with the API 10608, through input devices 7204, external devices 7240, external computing facilities 7232, eyepiece command and control facilities 10630, and the like. The internal application 7214 may use network communication connections with the eyepiece 100, such as the Internet 10622, a local area network (LAN), a mesh network with other eyepieces and mobile devices, a satellite communications link, a cellular network, and the like. The internal application 7214 may be purchased through an app store (e.g., the app store 10602, the 3D AR eyepiece app store 10610, etc.). The internal application 7214 may be provided by the 3D AR eyepiece store 10610, such as an internal software application 7214 developed specifically for the eyepiece 100.
The eyepiece application development environment 10604 may be used by software developers to create new eyepiece applications (for example, 3D applications), to modify base applications, to create new 3D versions of base applications, and the like. The eyepiece application development environment 10604 may include a 3D application environment that adapts a completed application to be loaded onto the eyepiece, or that otherwise provides developers access to the control schemes, UI parameters, and other specifications available on the eyepiece. The eyepiece may include an API 10608 designed to facilitate communication between completed applications and the eyepiece computing system. Application developers working in the development environment can then focus on developing applications with specific functionality, without worrying about the details of how they themselves interact with the eyepiece hardware. The API may also make it easier for developers to directly modify existing applications to create 3D applications for use on the eyepiece 100. In embodiments, the internal application 7214 may utilize the networked server 10612 for a client-server configuration, a mixed client-server configuration (for example, the internal application 7214 runs partly locally on the eyepiece 100 and partly on the application server 10612), a fully server-hosted application with download from the server, and the like. Networked data storage 10614 may be provided in association with the internal application 7214, such as further in association with the application server 10612, purchased applications, and the like. In embodiments, the internal application 7214 may interact with a sponsor facility 10614, a marketplace 10620, and the like, such as to provide sponsored advertisements in conjunction with execution of the internal application 7214, to provide marketplace content to the user of the eyepiece 100, and so on.
In embodiments, software and/or applications may be developed to be used with or to supplement the eyepiece. Applications for the eyepiece may be developed via open-source platforms, closed-source platforms, and/or software development kits. The software development kits for developing eyepiece applications, and the software developed from them, may be open source or closed source. Applications may be developed to be compatible with Android, Apple, or other platforms. Applications may be sold or downloaded through an app store associated with the eyepiece, an independent app store, and the like.
For example, the integrated processor of the eyepiece may run at least one software application and process content for display to the user, and the content may be introduced into the optical assembly of the eyepiece by the integrated image source. The software application may provide interactive 3D content to the user through interaction with at least one of the eyepiece's control and sensor facilities.
In embodiments, the eyepiece may be used for a variety of applications. The eyepiece may be used for consumer applications. By way of example only, and not as an exhaustive list, the eyepiece may be used for or with the following applications: travel applications, educational applications, video applications, exercise applications, personal assistant applications, augmented reality applications, search applications, local search applications, navigation applications, movie applications, facial recognition applications, venue identification applications, character and landmark recognition applications, text applications, instant messaging applications, email applications, to-do list applications, social networking applications, and the like. Social networking applications may include applications such as Facebook, Google+, and the like. In embodiments, the eyepiece may be used for enterprise applications. By way of example, and not as an exhaustive list, the eyepiece may be used for or with the following applications: billing applications, customer relationship management applications, business intelligence applications, human resource management applications, forms automation applications, office productivity applications, Microsoft Office, and the like. In embodiments, the eyepiece may be used for industrial applications. By way of example, and not as an exhaustive list, the eyepiece may be used for or with the following applications: advanced product quality planning software applications, production part approval software applications, statistical process control applications, professional training applications, and the like.
With reference to Figure 107, the eyepiece application development environment 10604 may be used for development of applications that can be submitted to the app store 10602, the 3D AR eyepiece app store 10610, and the like. The eyepiece application development environment 10604 may include a user interface 10702, access to control schemes 10704, and the like. For example, a developer may use menus and dialog boxes in the user interface to access control schemes 10704 for selection, so that the application developer may choose a scheme. The developer may select a template scheme for general operation of the application, but may also have individual controls for various functions that can be selected to override the template's functions at certain times during application execution. The developer may also use the user interface 10702, along with control scheme development, to control the field of view (FOV) of the application (such as through a FOV interface). The FOV interface may provide a means of arbitrating between the FOVs of the two displays (one for each eye) and a single display. In embodiments, a 3D application for the eyepiece may be designed in a single display view, because the API 10608 will provide the interpretation that determines which display is used for which content, although the developer may be able to select a specific eye for display of specific content. In embodiments, the developer may manually select and/or preview, such as through the user interface 10702, what will be shown in each eye.
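As a rough illustration of the per-eye arbitration just described — a single-display default with an optional developer override pinning content to one eye — the following sketch shows one way such a routing layer could be structured. All names here (Eye, Content, route_content) are hypothetical; this disclosure does not specify an actual eyepiece API.

```python
from dataclasses import dataclass
from enum import Enum

class Eye(Enum):
    LEFT = "left"
    RIGHT = "right"
    BOTH = "both"   # let the routing layer decide / mirror to both displays

@dataclass
class Content:
    name: str
    stereo: bool = False        # True if the app supplies separate L/R frames
    preferred_eye: Eye = Eye.BOTH

def route_content(content: Content, displays_available: int) -> list:
    """Return the list of eyes (displays) that should render this content."""
    if displays_available == 1:
        return [Eye.LEFT]                      # single-display fallback
    if content.stereo:
        return [Eye.LEFT, Eye.RIGHT]           # true stereo pair
    if content.preferred_eye in (Eye.LEFT, Eye.RIGHT):
        return [content.preferred_eye]         # developer pinned one eye
    return [Eye.LEFT, Eye.RIGHT]               # mirror to both by default

# Example: a 2D notification pinned to the right eye on a two-display eyepiece
print(route_content(Content("notification", preferred_eye=Eye.RIGHT), 2))
```

The point of the sketch is the default path: an application designed for a single display view still renders correctly on two displays because the routing layer, not the application, decides where content goes.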
The eyepiece may have a software stack 10800 as depicted in Figure 108. The software stack 10800 may have a head-worn hardware and software platform layer 10818, an API wrapper 10814 interfacing to the platform layer, a libraries layer 10812 for development, an application layer 10801, and the like. The application layer 10801 may in turn include consumer applications 10802, entertainment applications 10804, industrial applications 10808, and other similar applications 10810. In addition, hardware 10820 associated with the execution or development of internal applications 7214 may also be incorporated into the software stack 10800.
In embodiments, the user experience may be optimized by ensuring that the augmented image is focus-aligned relative to the surrounding environment, and that the display is set to an appropriate brightness given the ambient light and the content being displayed.
In one embodiment, the eyepiece optical assembly may include electro-optic modules, also referred to as displays, that deliver content to each eye in a stereoscopic manner. In some cases, a stereoscopic view is not ideal. In embodiments, for particular content, only one display may be turned on, or only one electro-optic module may be included in the optical assembly. In other embodiments, the brightness of each display may be varied so that the brain ignores the darker display. Automatic brightness control of the image source may control the brightness of the displayed content according to the brightness of the environment. The rate at which the brightness changes may depend on the change in the environment. The rate of brightness change may be matched to the adjustment of the eyes. Displayed content may be turned off for a period of time after a sudden change in ambient brightness. Displayed content may dim as the environment dims. Displayed content may brighten as the environment brightens.
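The auto-brightness behavior described above — track the ambient level, limit the rate of change to what the eye can follow, and blank the content briefly after a sudden ambient change — can be sketched as a simple rate-limited controller. The numeric values (the rate limit and the "sudden change" threshold) are illustrative assumptions, not values from this disclosure.

```python
def step_display_brightness(current, ambient, dt,
                            max_rate=0.05,      # max change per second (assumed)
                            sudden_jump=0.5):   # ambient delta treated as "sudden"
    """One control step: move display brightness toward the ambient brightness
    (both normalized to 0..1), limited to a rate the eye can comfortably
    follow. Returns (new_brightness, blanked)."""
    target = ambient                  # simplest policy: track ambient directly
    delta = target - current
    if abs(delta) >= sudden_jump:     # sudden ambient change: blank the content
        return 0.0, True
    limit = max_rate * dt             # rate-limit the adjustment
    delta = max(-limit, min(limit, delta))
    return current + delta, False
```

Called once per frame (or once per second), this yields a display that dims as the environment dims and brightens as it brightens, but never jumps faster than the configured adaptation rate.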
When entering a dark environment from a bright environment, the human eye needs a period of time to adapt to the dark. During this period, the eyes have only limited visibility of the dark surroundings. In some cases, such as in security or law-enforcement situations, it is important to be able to move from a bright environment into a dark environment and quickly determine what activities or objects are present in the dark environment. However, the human eye needs up to 20 minutes to fully adapt to a dark environment. During this time, a person's vision of the environment is impaired, which can lead to dangerous situations.
In some cases, a strong light such as a flashlight is used to illuminate the dark environment. In other cases, it is possible to cover a person's eyes for a period of time before entering the dark environment, to allow the eyes to partially adapt to the dark before entry. However, in situations where a strong light cannot be used in the dark environment and covering the person's eyes before entry is impractical, it is desirable to provide a method of assisted viewing that reduces the time during which a person's vision is impaired in the transition from bright to dark.
Night-vision goggles and binoculars are known for providing images of dark surroundings. However, these devices provide images of constant brightness and thus do not allow the user's eyes to adapt to the dark, so the devices must be used continuously in the dark environment. As a result, these devices do not take advantage of the fact that a person can see well in a dark environment after their eyes have fully adapted to the dark.
United States Patent 8094118 provides a method of adjusting the brightness of a display in accordance with the brightness of the surrounding environment to save power. That method is directed to perceived display brightness and is not related to the adjustment of the user's eyes in a transition from a bright environment to a dark environment. In addition, that method does not assist the user in viewing the environment.
Therefore, what is needed is a method to assist a person moving from a bright environment to a dark environment during the period of time in which the person's eyes are adapting to the dark.
A head-mounted display device with see-through capability provides a clear view of the scene in front of the user while also providing the ability to show displayed images, wherein the user sees a combined image composed of the see-through view and an overlaid displayed image. The present disclosure provides a method of providing an assisted view of the environment while the user transitions from a bright environment to a dark environment. The method uses a camera on the head-mounted display device to rapidly adjust the capture conditions so that images of the dark environment can be captured and displayed to the user. The brightness of the displayed images is progressively reduced so that the user's eyes adapt to the dark environment.
Figure 154 (derived from data of Hecht and Mandelbaum, in "Dark Adaptation and Night Vision" by Pirenne, M.H., Chapter 5 of "The Eye," Vol. 2, Davson, H., editor, Academic Press, London, 1962) shows a chart of typical dark adaptation curves for the human eye, where the shaded region represents 80% of the studied subject population. In this chart, the curves show the minimum illuminance that can be perceived at a particular time, where time 0 corresponds to being under bright light conditions and immediately entering a dark environment, and where the minimum perceivable illuminance is determined by showing a person spots of light of different illuminance on a region and having the person report which spots can be seen after different times in the dark. As can be seen from the curves, the human eye adjusts over time so that spots of lower illuminance can progressively be seen, over a period of approximately 25 minutes. As annotated in the chart of Figure 154, there are actually two mechanisms contributing to the dark adaptation of the human eye. The cones in the eye (associated with photopic vision) adjust more quickly under brighter conditions than the relatively slowly adjusting rods (associated with scotopic vision). As a result, the time needed to adapt when moving from brighter conditions to darker conditions can be quite long, depending on how dark the environment is. During the dark adaptation period, the person can be nearly blind.
Table 2 provides typical brightness values for common lighting conditions in two units: Lux and Lambert. The range of illuminance for outdoor lighting conditions spans 9 orders of magnitude, from bright sunlight to an overcast, moonless night. Brightness values for indoor lighting conditions are also provided for comparison.
Table 2 lists typical illumination levels, taken from the website http://www.engineeringtoolbox.com/light-level-rooms-d_708.html.
Typical illumination level Lux Lambert
Sunlight 107527 10.7527
Full daylight 10,752 1.0752
Overcast day 1075 0.1075
Very dark day 107 0.0107
Dusk 10.8 0.00108
Depth dusk 1.08 0.000108
Full moon 0.108 0.0000108
Crescent 0.0108 0.00000108
Starlight 0.0011 0.00000011
Cloudy night 0.0001 0.00000001
Supermarket 750 0.075
Common office 500 0.05
Classroom 250 0.025
Warehouse 150 0.015
Dark public area 35 0.0035
Table 3 provides the brightness values experienced (in Bril units) when the lighting condition to which an eye has fully adapted changes to a darker condition. The changes in lighting conditions shown relate to the illuminance values given in Lamberts in Table 2. An explanation of brightness is provided in the example at the bottom of Table 3, where 1 Bril of experienced brightness is approximately the brightness provided by a clear night with a crescent moon, or 0.000001 Lambert. The equation relating the brightness experienced by the human visual system to a change in lighting conditions is provided in United States Patent 8094118, and is reproduced below as Equation 3 for reference.
B = λ(L/La)^σ     (Equation 3)
where
σ = 0.4·log10(La) + 2.92
λ = 10^2.0208 · La^0.336
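Equation 3 can be evaluated directly in code. A minimal numeric sketch follows, assuming the reading λ = 10^2.0208 · La^0.336 and σ = 0.4·log10(La) + 2.92; the exact constants are this reconstruction's interpretation of the source text and may differ from the cited patent's precise form.

```python
import math

def perceived_brightness(L, La):
    """Experienced brightness B in Brils (Equation 3) for a luminance L viewed
    by an eye adapted to luminance La, both in Lamberts. Constants follow the
    assumed reading lambda = 10**2.0208 * La**0.336 and
    sigma = 0.4*log10(La) + 2.92."""
    sigma = 0.4 * math.log10(La) + 2.92
    lam = 10 ** 2.0208 * La ** 0.336
    return lam * (L / La) ** sigma

# Example: brightness experienced when an eye adapted to an overcast day
# (0.1075 Lambert) views a warehouse level (0.015 Lambert)
print(perceived_brightness(0.015, 0.1075))
```

Note the qualitative behavior Table 3 relies on: for a fixed adaptation level La, the experienced brightness B falls steeply as the viewed luminance L drops below La, which is why a sudden move to a darker condition is experienced as near-blindness.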
Table 3 show other examples in, it is readily seen that many situations encountered in real life cause wherein according to The change of bright condition causes the dark situation experienced.The change of various lighting conditions is shown in table 3 and when this changes Become the brightness experienced when occurring for the first time.In many examples in these examples, it is moved to when for the first time from bright condition darker The brightness experienced when condition is significantly less than in the complete brightness provided by crescent for adapting to experience after dark.It is mobile from daylight Public domain to warehouse or dark is especially problematic, and eyes are substantially to have become blind a period of time until it becomes to adapt to newly Lighting condition.Invention described herein provide it is a kind of for from bright conditional transition to during darker condition, Eyes are adapting to the method for assisting human eye when darker condition.
Table 3 shows the brightness levels experienced when changing from a bright environment to a dark environment, computed using Equation 3 and the brightness values from Table 2:
Figure 155 provides measurement data on the speed of dark adaptation, from the paper by Spillmann, L., Nowlan, A.T., and Bernholz, C.D., "Dark Adaptation in the Presence of Waning Background Luminance" (Journal of the Optical Society of America, Vol. 62, No. 2, February 1972). Figure 155 shows measured increment thresholds for linearly decreasing log background luminance. The background was changed over 3.5 minutes ( ), 7 minutes (Δ), 14 minutes (○), and 21 minutes (◇), and over 7 log units in 3.5 minutes without pre-exposure (■). Arrows indicate the time of background extinction. The normal dark-adaptation threshold recorded in the absence of any background luminance (×) largely coincides with the steep background slopes and becomes invariant thereafter.
The data in Figure 155 are based on measurements of the minimum illuminance level (threshold) at which a bright spot can be detected by the human eye, showing that the measured threshold progressively decreases as the eyes become dark-adapted (more sensitive) when the lighting condition goes from 0.325 Lambert (a partly cloudy day) to complete darkness. The different curves shown in the chart of Figure 155 are for conditions in which the change from bright to dark is completed at different linear rates (rather than the immediate change shown in Figure 154). The curves toward the left side of the chart, for which the change from bright to dark is completed more quickly, show more rapid adaptation to the dark. As supported by the data shown in Figure 154 and Figure 155, the typical period needed to adapt to the dark when moving directly from bright to complete darkness is approximately 15 minutes. The data in Figure 155 show that brightness can be changed linearly over a period of 14 minutes with only a small loss in dark-adaptation time: the data in Figure 155 show the dark-adaptation time increasing from 15 minutes for an immediate change to 19 minutes for a 14-minute ramp. The present invention therefore provides a method of presenting displayed images of the dark environment with a brightness that is progressively reduced over time, so that the user is provided with a viewable image of the dark surroundings while the user's eyes are still allowed to adapt to the dark environment. The method uses a camera that adjusts quickly to the dark environment so that it can capture images of the dark surroundings. The captured images are provided to the user on the see-through head-mounted display, where the brightness of the images is progressively reduced over time, so that the user's eyes can adapt to the dark and the user can progressively see the environment through the see-through capability of the head-mounted display.
Figure 156 is an illustration of a head-mounted display device 15600 with see-through capability. The head-mounted display device 15600 includes a see-through display 15602, one or more cameras 15604, and electronics 15608, where the electronics 15608 can include one or more of the following: a processor, a battery, a GPS sensor (global positioning system), a direction sensor, data storage, a wireless communication system, and a user interface.
In one embodiment, at least one camera 15604 or 15610 of the head-mounted display device 15600 is used to provide an enhanced view of the dark environment in the see-through display 15602 while the user's eyes are adapting to the dark environment. The camera 15604 or 15610 can use an auto-exposure system to rapidly and automatically adjust capture settings such as gain, ISO, resolution, or pixel binning. In some embodiments, the lens of the camera 15604 or 15610 can be changed to enable improved image capture in the dark environment. The brightness of the image shown in the see-through display 15602 can be adjusted over time to match the adaptation of the eyes and any changes in a photochromic material associated with the head-mounted display device 15600. In this way, a rapidly changing photochromic material is not needed; photochromic materials with transition times on the order of minutes are well suited to embodiments of the present invention. In any case, the field of view of the displayed image of the environment should match the field of view of the head-mounted display device 15600, to provide a display of the dark environment that is easy to relate to in an augmented reality mode.
The present invention provides the head-mounted display device 15600 with one or more cameras 15604 or 15610, wherein images of the scene in front of the user can be captured and displayed over time at brightnesses within a range. Compared with the rate at which the user's eyes adapt to changes in ambient brightness, the camera 15604 or 15610 and its associated auto-exposure system adapt to changes in ambient brightness much more quickly, typically within 1 second. In one embodiment, the camera 15604 or 15610 captures images of the scene in front of the user, and when the brightness of the scene changes rapidly from bright to dark, the captured images of the scene are displayed to the user in the see-through display 15602. The brightness of the displayed images is reduced over time, so that the user is presented with a bright image of the scene immediately after moving into the dark environment, and the brightness is then reduced over time at a rate that allows the user's eyes to adapt to the dark environment. Figure 157 shows a chart of the brightness of the displayed images provided to the user over time, where t1 is the time at which the ambient brightness changes from bright to dark. Capture of images of the environment can begin at or before time t1. After t1, the brightness of the displayed images is reduced until time t2, at which the user's eyes have adapted to the dark environment. After time t2, the brightness of the displayed images is held constant at a level at which the user can observe the environment in see-through mode. In other embodiments of the invention, the brightness of the displayed images after time t2 is 0, so that the user observes the dark environment in see-through mode only. In further embodiments of the invention, the content of the displayed images changes after t2 from captured images of the environment in front of the user to other images or to information such as augmented reality information (for example, instructions or directions). In another embodiment of the invention, if the environment is darker than a predetermined level, the brightness of the displayed images of the environment is reduced to a level that is maintained after time t2, thereby providing a version of night vision, wherein the night vision responds to rapid changes in ambient lighting and also provides longer-term night vision when conditions are too dark for the eyes to adapt to the task at hand. The dark level at which night-vision imaging is provided after time t2 can be selected by the user in an operating-mode setting, wherein tasks requiring more detail in the environment to be discerned use a brighter displayed-image setting during night-vision mode.
In a preferred embodiment, the brightness of the displayed images of the environment is reduced at a rate corresponding to the rate at which the user's eyes adapt to the dark environment, such as the 14-minute transition from a bright image to a dark image or to no image corresponding to the curves shown in Figure 155. In this way, while the user's eyes are adapting to the dark, the user is temporarily provided with images of the environment, while the time to adapt to the dark environment is not significantly extended compared with the adaptation time in the case of no displayed image.
In a further embodiment of the invention, when the user enters the dark environment, the lens of the camera 15604 or 15610 is changed to provide improved low-light image capture capability. In this case, the camera 15604 or 15610, or another light-sensitive detector in the electronics 15608, detects the change from a bright environment to a dark environment, wherein the brightness of the environment is detected by an automatic exposure sensor in the electronics 15608 or by detecting a reduction in the pixel code values of the image sensor in the camera 15604 or 15610. The lens of the camera 15604 or 15610 is then changed to improve the light-gathering capability or to enable the camera 15604 or 15610 to capture infrared images. Example: light-gathering capability can be improved by changing to a lower f# lens. Example: infrared image capture can be enabled in the camera 15604 or 15610 by removing the infrared cut filter in the lens assembly, by shifting the lens elements relative to one another to reset the focal length, or by changing one or more of the lens elements to infrared lens elements. In another embodiment, the image sensor of the camera 15604 or 15610 is changed to enable infrared image capture.
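The f# example can be illustrated numerically: light gathered per unit exposure time scales as 1/f#², so switching from an f/2.8 lens to an f/1.4 lens admits four times the light. The following is a minimal sketch; the function name and the specific f-numbers are illustrative assumptions, not taken from the source.

```python
def relative_light_gain(old_fnum: float, new_fnum: float) -> float:
    """Relative light gathered after switching from old_fnum to new_fnum.

    Light gathered per unit exposure time scales as 1/f#^2, so the
    gain from a lens change is (old_fnum / new_fnum)^2.
    """
    return (old_fnum / new_fnum) ** 2

# Switching from an f/2.8 lens to an f/1.4 lens quadruples the light.
print(relative_light_gain(2.8, 1.4))
```

Each full stop of f-number (a √2 change) doubles the light gathered, which is why low-f# lenses are attractive for the dark-environment capture described above.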
Figure 158 shows a flow chart of one method of the invention. In step 15802, the user moves from a bright environment to a dark environment. In step 15804, the camera 15604 (or another light-sensitive detector in the electronics 15608) detects the change of lighting conditions in the environment to dark conditions. In step 15808, the capture settings used by the camera 15604 or 15610 are adjusted by the auto-exposure system to enable image capture, and in particular video image capture, in the dark environment. In step 15810, images of the environment are captured by the camera 15604 or 15610 and displayed in the see-through display 15602 at a first brightness level, wherein the first brightness level of the displayed images is similar to the brightness perceived in the user's see-through view of the environment immediately before the lighting conditions changed from bright to dark. Then, in step 15812, the brightness of the displayed images of the environment is reduced over time, so that the user's eyes can adapt to the dark while simultaneously viewing images of the environment. The reduction in brightness can be linear over the period, or nonlinear as shown in Figure 157. The period over which the image brightness is reduced can correspond to the change in lighting conditions in the environment. Depending on how dark the environment is, in step 15812 the brightness of the displayed images can be reduced to 0 or maintained at a predetermined level to provide a version of night vision.
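The brightness ramp of step 15812 can be sketched as a simple schedule that holds the pre-darkness brightness until t1, falls to a final level by t2 (zero for see-through-only viewing, or a floor for the night-vision mode), and is constant thereafter. This sketch shows only the linear case; the function name, the 14-minute (840 s) ramp, and the brightness units are assumptions for illustration.

```python
def displayed_brightness(t: float, t1: float, t2: float,
                         first_level: float, final_level: float) -> float:
    """Brightness of the displayed image at time t (seconds).

    Up to t1 (bright environment) the display is at first_level; between
    t1 and t2 the brightness falls linearly so the eyes can dark-adapt;
    from t2 onward it is held at final_level (0 for see-through only, or
    a predetermined floor for a night-vision mode).
    """
    if t <= t1:
        return first_level
    if t >= t2:
        return final_level
    frac = (t - t1) / (t2 - t1)
    return first_level + frac * (final_level - first_level)

# 14-minute (840 s) linear ramp from full brightness to dark,
# matching the 14-minute curve discussed for Figure 155.
print(displayed_brightness(0, 0, 840, 1.0, 0.0))    # at t1
print(displayed_brightness(420, 0, 840, 1.0, 0.0))  # halfway through the ramp
print(displayed_brightness(900, 0, 840, 1.0, 0.0))  # after t2
```

A nonlinear ramp, as in Figure 157, would replace the `frac` interpolation with a curve matched to the adaptation data.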
Exemplary scene 1
A police officer working in daylight (about 1.0 lambert) breaks down a door leading into a dark room whose darkness (about 0.0035 lambert) is similar to that of many restaurants, as shown by the data in Table 2. When the door is opened, the officer will perceive the dark room at a relative brightness of 0.000007, i.e., 10000X darker than the illumination provided by a crescent moon, as the data in Table 3 show. In fact, he will not be able to see anything in the dark room. According to the curves in Figure 155, approximately 1 minute will pass before the officer can see anything in the dark room (which is at 0.0035 lambert (0.0035 lambert = 0.54 log millilambert)). This is a dangerous situation, because the eyes of the people in the dark room have already adapted to the dark, so they will be able to see the officer. With the officer wearing a head-mounted display device with a camera and see-through display as described herein, images of the dark room can be presented to the officer during the approximately 1.5 minutes in which the officer's eyes are adapting to the dark. After that time, the officer can view the dark room through the see-through display. The see-through display can still be used to send instructions or other information to the officer (such as in an augmented reality presentation) while the officer views the dark room through the see-through display. The head-mounted display device of the invention thus provides the officer with immediate vision in the dark room, limited only by the low-light capability of the camera.
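The log-millilambert figures quoted in these scenarios are simply log10 of the luminance expressed in millilamberts (1 lambert = 1000 millilamberts). A small sketch of that conversion follows; the function name is an assumption for illustration.

```python
import math

def log_millilambert(luminance_lamberts: float) -> float:
    """Convert a luminance in lamberts to log10 millilamberts
    (1 lambert = 1000 millilamberts)."""
    return math.log10(luminance_lamberts * 1000.0)

# The dark room at 0.0035 lambert is 0.54 log millilambert, and the
# full-moon night quoted later at 0.00001 lambert is -2 log millilambert.
print(round(log_millilambert(0.0035), 2))   # 0.54
print(round(log_millilambert(0.00001), 2))  # -2.0
```

These are the units of the x-axis of the Figure 155 adaptation curves, which is why the scenarios quote both forms.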
As long as the field of view presented in the displayed image closely matches the corresponding portion of the officer's field of view, and the video image is live with only limited lag between capture and display, the officer will easily be able to move around the dark room using only the displayed image. As the officer's eyes adapt to the dark room, the brightness of the displayed images is reduced over time.
The camera can be a fairly standard digital camera with good low-light performance operating in high-ISO and binning modes, providing video imaging down to partial-moonlight illumination levels. Short-wave infrared cameras or cameras with visible plus near-infrared imaging capability (such as cameras with the infrared cut filter removed) can be used to provide imaging down to darker levels. As the data shown in Figures 154 and 155 indicate, under very dark conditions it may be necessary to provide images to the user for up to 25 minutes, at which point the user's eyes will have fully adapted to the dark.
Exemplary scene 2
A soldier inside a lighted house (illumination of 0.025 lambert = 0.40 log millilambert) opens a door and walks out into a night with a full moon (illumination of 0.00001 lambert = -2 log millilambert). As can be seen from the figures in Table 3, when the soldier first steps into the night, the darkness experienced is effectively complete darkness, comparable to a brightness of 0.000001 (which, once the eyes are fully adapted, is 1000000X darker than a night with a crescent moon). The curves in Figure 155 show that, for this change in illumination, approximately 2 minutes may be required before the soldier's eyes can see objects under the darker conditions. As in the previous example, this can be a dangerous situation, because the soldier is essentially blind for 2 minutes. The present invention provides a see-through head-mounted display that captures images of the environment and displays them to the soldier, eliminating the period of blindness. In this case, the brightness of the images can be reduced over a period of 3-4 minutes, so that the soldier's eyes adapt to the dark, and at the end of that time the soldier can operate the head-mounted display in see-through mode or augmented reality mode.
Immediate visibility over the display field of view can thus be provided by the displayed image. As the user's eyes adapt to the dark conditions, a transition to see-through viewing is provided by gradually reducing the brightness of the displayed image.
This technique may additionally be used to compensate for photochromic lenses associated with the head-mounted display device, which cannot become clear very rapidly.
In alternative embodiments, the images presented to the user can be 2D (captured by a single camera 15610, in which case the same image is presented to both of the user's eyes) or 3D (captured by the stereo cameras 15604, in which case the images presented to the user's eyes provide different perspectives of the scene). As is known to those skilled in the art, other methods of generating stereo images are also possible, such as using a lens with a split pupil or using a lens with a microlens array to achieve light-field imaging.
The displayed images can also be shifted to a different color (such as red or green) to help the eyes adapt to the dark more quickly, as is commonly done in night-vision binoculars.
In embodiments, the augmented reality (AR) eyepiece of the invention is adapted to determine and/or compensate for vergence of the user's eyes. Vergence is the simultaneous rotation of the user's eyes about their vertical axes, moving their respective optical axes in opposite directions to obtain or maintain binocular vision. When a person looks at a closer object, the person's eyes move their respective optical axes inward toward the nose, a combined motion referred to as convergence. To look at a more distant object, the person's eyes move their respective optical axes outward away from the nose, a combined motion referred to as divergence. When the person gazes at infinity or at a very distant object, the person's eyes diverge until their respective optical axes are essentially parallel to one another. Vergence operates together with accommodation to allow a person to maintain a clear image of an object as the object moves relative to the person. Compensating for vergence becomes important when a virtual image (that is, an AR image), such as a label or other information, is to be placed adjacent to or over a real image, or when the virtual image of an object is to be superimposed on the real image of the object, so that the placement of the virtual image relative to the real image is correct. The methods of the invention for vergence compensation and/or determination are described herein and are collectively referred to as vergence methods.
A vergence method may include determining the distance of an object of interest from the AR eyepiece and then using that distance to determine the vergence angle, that is, the angle formed by the intersection of the optical axes of the user's eyes when the user looks at the object. The vergence angle is then used to determine the correct placement of an AR image relative to the object, which can be in front of the object, behind it, or at a position matching it. For example, in a first group of vergence method embodiments, a single auto-focus digital camera with an output signal is assembled at a convenient position in the AR eyepiece, for example in the nose-bridge region or near one of the temples. The output of the camera is provided to a microprocessor in the AR eyepiece and/or transmitted to a remote processor. In either case, a signal related to the camera's auto-focus capability is used to determine the distance of the object the user sees when looking straight ahead. That distance and the interpupillary distance of the user's eyes are used to determine the vergence and the correct placement of the desired virtual images (for example, labels) for those objects. The distance and/or vergence angle may also be used to determine the focus level at which the virtual object can be correctly observed by the user. Optionally, additional information about the vergence characteristics of a specific user can be entered and stored in memory associated with the microprocessor and used in determining vergence adjustments.
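The geometry described above can be sketched directly: for an object on the midline between the eyes, each eye rotates inward by atan((IPD/2)/Z), so the vergence angle between the two optical axes is twice that. The following is a hedged illustration; the 64 mm interpupillary distance and the function name are assumptions, not values from the source.

```python
import math

def vergence_angle_deg(ipd_m: float, distance_m: float) -> float:
    """Vergence angle (degrees) for an object on the midline.

    Each eye rotates inward by atan((ipd/2) / distance), so the angle
    between the two optical axes is twice that.
    """
    return 2.0 * math.degrees(math.atan((ipd_m / 2.0) / distance_m))

# With a 64 mm IPD, the vergence angle shrinks with distance,
# approaching zero (parallel optical axes) for very distant objects.
for z in (0.5, 1.0, 10.0):
    print(f"{z:5.1f} m -> {vergence_angle_deg(0.064, z):.3f} deg")
```

Given the measured distance from the auto-focus camera and the user's interpupillary distance (for example, from the calibration sequence described below), this angle determines the lateral placement of the AR image that matches the object.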
In a second group of vergence method embodiments, an electronic rangefinder independent of a camera is integrated at a convenient position in the AR eyepiece, for example in the nose-bridge region or near one of the temples. In these embodiments, the output of the electronic rangefinder is used in the same way as the output of the auto-focus camera described in connection with the first group of vergence method embodiments.
In a third group of vergence method embodiments, the AR eyepiece includes multiple distance-measuring devices, which can be auto-focus cameras and/or electronic rangefinders. All of the multiple devices can be aligned in the same direction to determine the distance of an object, or one or more of the devices can be aligned differently so that information about the distances to various objects is available in addition to that from the other devices. The output from one or more of the devices is input and analyzed in the same way as the output of the auto-focus camera described in connection with the first group of vergence method embodiments.
In a fourth group of vergence method embodiments, one or more distance-measuring devices are used in the manner discussed above. Additionally, the AR eyepiece includes one or more eye-tracking devices configured to track the movement and/or gaze direction of one or both of the user's eyes. The output of the eye-tracking devices is provided to the microprocessor in the AR eyepiece or may be transmitted to a remote processor. The output is used to determine the direction in which the user is looking and, when eye-tracking information from both eyes is available, to determine the vergence of the user's eyes. The direction and vergence information (if applicable) is then used alone, or together with direction information determined from the distance-measuring devices, to determine the placement and (optionally) the focus level of one or more virtual images related to one or more objects the user is likely to be looking at.
In a fifth group of vergence methods, one or more distance-measuring devices are aimed in a direction away from straight ahead of the user of the AR eyepiece. The distance to an object detected by the rangefinder device is used in the manner described above to display a virtual image of the object. Although the user may or may not be aware of the virtual image while looking straight ahead, the user will become aware of the virtual image when looking in the direction of the object related to the virtual image.
A calibration sequence can be used with any of the vergence method embodiments. The calibration sequence can use mechanically calibrated steps, electronically calibrated steps, or both. During the calibration sequence, the interpupillary distance of the user can be determined. Also, the user can be asked to look at a series of real or virtual objects over a range of real or simulated distances (for example, from near to far), and the vergence of the eyes is measured mechanically, electronically, or both. Information from the calibration sequence can then be used in the determination of vergence, focus, and/or virtual image placement when the AR eyepiece is used. The calibration sequence is preferably used when the user first puts on the AR eyepiece, but can also be used at any time the user considers recalibration useful. The information related to the user and the information obtained during the calibration sequence can be stored for the specific user, for example using any of the techniques described herein, and can be used whenever the user identifies himself to the AR eyepiece.
It should be noted that some distance-measuring devices use distance-determination methods in which the information received from the sensor of the device is mapped onto a straight-line or curved grid in a spatial representation. The information from the sections of the grid is used to determine range distances relative to one another. In vergence method embodiments, the raw sensor data, the mapping information, the computed distances, or any combination of these can be used for the placement and/or focus of one or more virtual images.
It should be understood that the vergence method embodiments include placing virtual images for one of the user's eyes or for both of the user's eyes. In some embodiments, one virtual image is provided to the user's left eye and a different virtual image is provided to the user's right eye. This, for example, allows one or more virtual images to be provided to one eye while information obtained from the other eye is used for calibration. Where multiple images are placed before the user, whether the images are the same or different, the placement can be simultaneous, at different times, or interlaced in time; for example, the images can be displayed at one or more predetermined flicker rates (for example, 30, 60, and/or 180 hertz), where the image for the left eye is presented when the image for the right eye is not, and vice versa. In some embodiments, virtual images are shown only to the person's dominant eye, and in other embodiments, virtual images are shown only to the person's non-dominant eye. In some embodiments using images interlaced in time, virtual images for various objects positioned at various distances from the user are displayed in the manner described above; when the user looks from the real image of one object to the real image of another object, only the virtual image corresponding to the real image of the object being viewed will be seen by the user's brain. For example, by using a focus mechanism operated at high speed (for example, 30 to 60 hertz), such as a piezoelectric actuator attached to the LCOS or a variable-focus lens inserted in the optical path, one or more of the same or different virtual images can be placed in more than one depth plane for one or both of the user's eyes.
In some embodiments, the focal distance of a virtual image can be adjusted to provide the user with the illusion of the virtual image being at a desired distance. Such adjustment is particularly useful when images are being presented to both of the user's eyes and the relative lateral positions of the two images are adjusted for vergence. This adjustment can be achieved, for example, by adjusting the optical path length over which the image is displayed, or by using one or more variable lenses, which can be realized in some embodiments of the invention, for example, by raising or lowering the LCOS panel.
In embodiments, the present invention provides methods for conveying depth cues through augmented reality virtual objects or virtual information, whereby a wide range of perceived depths can be conveyed by the augmented reality virtual objects or virtual information to individuals with different eye characteristics. These depth cue method embodiments of the invention use differences between the lateral positioning of the augmented reality images provided to the individual's two eyes to produce the difference in vergence that conveys a sense of depth for the virtual objects or virtual information. One advantage of these methods is that the lateral shift of the augmented reality image can differ for different portions of the augmented reality image, so that the perceived depth differs for those portions. In addition, the lateral shift can be achieved by image processing of the portions of the augmented reality image. With these methods the user can experience the full range of perceived depths, from as close as the individual can focus out to infinity (regardless of the individual's age).
To better understand these depth cue method embodiments of the invention, it is useful to recall some aspects of augmented reality: a head-mounted display is used to add images of virtual objects or virtual information related to the view of the scene the user is seeing. To add to the effect on the perception of augmented reality, it is useful to place the virtual objects or virtual information at perceived depths within the scene. As an example, a virtual label (such as the name of a building) can be placed on an object in the scene. If the virtual label and the building are perceived to be at the same depth in the scene, the perceptual association of the virtual label with the building is enhanced. Head-mounted displays with see-through capabilities are well suited to providing augmented reality information (such as labels and objects), because they provide the user with a clear view of the environment. However, for augmented reality information to be valuable, it must be easily associated with objects in the environment, and as a result, the positioning of the augmented reality information relative to the objects in the see-through view is important. While horizontal and vertical positioning of augmented reality information is relatively straightforward in head-mounted displays with a camera that can be calibrated to the see-through view, depth positioning is more complicated. United States Patent 6690393 describes a method for positioning 2D labels in a 3D virtual world. However, this method does not provide for see-through displays in which the majority of the image the user sees is not provided digitally and the 3D positions of objects are therefore unknown. United States Patent 7907166 describes a robotic surgical system using a stereo viewer, wherein telestration illustrations are overlaid on a stereo image of the operative site. However, similar to the method described in United States Patent 6690393, that system uses captured images which are then operated on to add the illustrations, and thus does not address the specific situation of a see-through display in which the majority of the image is not provided digitally and the relative positions of the objects the user sees are unknown. Another prior art approach for augmented reality is to adjust the focus of the virtual objects or virtual information, so that the user experiences differences in focus depth, which provide the user with a depth cue. As the user must refocus his/her eyes to look at objects in the scene and to see the virtual objects or virtual information, the user senses the associated depth. However, the range of depths that can be associated with focus is limited by the accommodation the user's eyes can achieve. Accommodation is limited in certain individuals when their eyes lose much of their accommodation range, especially as individuals get older. In addition, the accommodation range differs depending on whether the user is nearsighted or farsighted. These factors make methods relying on focus cues unreliable for a large population of users of different ages and with different eye characteristics. Consequently, there remains a need, beyond the prior art, for widely applicable methods of associating depth information with augmented reality.
Some of the depth cue method embodiments of the invention are described herein and in the following paragraphs in connection with Figures 109 through 121. A head-mounted display with see-through capabilities provides a clear view of the scene in front of the user and at the same time provides the ability to display images, wherein the user sees a combined image composed of the see-through view and the overlaid displayed image. The methods use the see-through display to show 3D labels and other 3D information to help the user interpret the environment surrounding the user. Stereo image pairs of the 3D labels and other 3D information can be presented to the user's left and right eyes to position the 3D labels and other 3D information in the scene at different depths as perceived by the user. In this way, the 3D labels and other 3D information can be more easily associated with the see-through view and the surrounding environment.
Figure 109 is an illustration of a head-mounted display device 109100 with see-through capabilities, shown here as worn, as a particular version of the augmented reality eyepiece 100 depicted in Figure 1. The head-mounted display device 109100 may include see-through displays 109110, stereo cameras 109120, electronics 109130, and a rangefinder 109140. The electronics may include one or more of the following: a processor, a battery, a GPS sensor (global positioning system), a direction sensor, data storage, a wireless communication system, and a user interface.
Figure 110 is an illustration of the scene in front of the user as seen by the user in the see-through view. Multiple objects at different depths in the scene are shown for use in the discussion. In Figure 111, several objects in the scene are identified and labeled. However, the labels are presented in a two-dimensional (2D) manner, either by presenting the labels to only one of the user's eyes, or by presenting labels at the same position in the image to each eye so that the labels coincide when viewed simultaneously. This type of labeling makes it more difficult to associate the labels with the objects, especially when there are foreground and background objects, because the labels appear to lie entirely at the same perceived depth.
To make it easier to associate labels or other information with the desired objects or aspects of the environment, it is beneficial to present the labels or other information as three-dimensional (3D) labels or other 3D information, so that the information is perceived by the user at different depths. This can be done by presenting the 3D labels or other 3D information, in overlay images overlaid on the see-through view, to the user's two eyes with a lateral shift in position between the overlay images, so that the overlay images have perceived depth. To those skilled in the field of stereo imaging, this lateral shift between images is known as disparity, and it causes the user to change the relative orientation of his/her eyes to visually align the images, which results in depth perception. The images with disparity are images of the 3D labels or other 3D information overlaid on the see-through view of the scene the user sees. By providing 3D labels with large disparity in the stereo images, the user must slightly converge the optical axes of his/her eyes to bring the labels into alignment, providing the perception that the label is located close to the user. 3D labels with small disparity (or no disparity) can be visually aligned while the user's eyes look straight ahead, and this provides the perception that the 3D label is located far away.
Figures 112 and 113 show a stereo image pair of 3D labels to be applied to the see-through view shown in Figure 110. Figure 112 is the image of the 3D labels to be displayed to the user's left eye, and Figure 113 is the image of the 3D labels to be displayed to the user's right eye. Figures 112 and 113 together provide the stereo image pair. In this stereo pair, the lateral positions of the 3D labels differ between the images shown in Figures 112 and 113. Figure 114 provides an overlay of Figures 112 and 113. For added clarity in Figure 114, the 3D labels from Figure 113 are shown in grey and the 3D labels from Figure 112 are shown in black. In the foreground of Figure 114, the 3D label from Figure 113 is located to the left of the 3D label from Figure 112 with relatively large disparity. In the background of Figure 114, the 3D label from Figure 113 coincides with the 3D label from Figure 112 and is positioned on top of it with no disparity. In the mid-scene region shown in Figure 114, the 3D labels from Figures 112 and 113 have intermediate disparity. Presenting this relative disparity of the 3D labels to the left and right eyes corresponds to depth as perceived by the user. By selecting, for each 3D label, a depth consistent with the depth of the object in the scene with which the 3D label is associated, the user can readily understand the connection between the 3D label and the object, or other aspect of the environment, seen in the see-through view. Figure 115 shows the see-through view of the scene with the 3D labels exhibiting their disparity. In real-life viewing, however, the user changes the orientation of his/her eyes to make the 3D labels in each left/right set coincide, and it is exactly this that provides the user with the perception of depth.
The calculating of parallax it is known to those skilled in the art that.For relative disparity and the equation of distance by equation 1 provides:
Z = Tf/d
where Z is the distance from the stereo camera to the object, T is the separation distance between the stereo cameras, f is the focal length of the camera lens, and d is the disparity distance on the camera sensor between the images of the same object in the scene. Rearranging the terms to solve for disparity, the equation becomes Equation 2:
d = Tf/Z
For example, for cameras with 7 millimeter focal length lenses separated by 120 millimeters and used in conjunction with image sensors having a 2.2 micron center-to-center pixel pitch, the disparity, expressed as the number of pixels by which a viewed object point is shifted in one display relative to the other, is provided in Table 1 for some representative distances (given in meters).
Table 1

    Distance (m)    Disparity (pixels)
    1               381.8
    2               190.9
    10              38.2
    50              7.6
    100             3.8
    200             1.9
Note that sometimes in the prior art, the disparity values of stereoscopic images are described using numbers ranging from negative to positive, where zero disparity is defined for an object at a selected location in front of the observer, at which the observer perceives the object to lie in the midground. Given this shift of the zero point, the equations listed above must be adapted accordingly. When disparity values are described in this way, the disparities of near and far objects can be equal in magnitude but opposite in sign.
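As an illustrative aside (not part of the patent text), Equations 1 and 2 and the pixel conversion behind Table 1 can be sketched in a few lines of Python; the camera values are the ones given in the example above:

```python
def disparity_mm(T_mm: float, f_mm: float, Z_mm: float) -> float:
    """Equation 2: disparity d = T*f/Z on the sensor, in millimeters."""
    return T_mm * f_mm / Z_mm

def disparity_pixels(T_mm: float, f_mm: float, Z_m: float,
                     pixel_pitch_um: float) -> float:
    """Disparity in sensor pixels for an object at Z meters."""
    d_mm = disparity_mm(T_mm, f_mm, Z_m * 1000.0)
    return d_mm * 1000.0 / pixel_pitch_um  # mm -> um -> pixels

# Values from the example: 120 mm camera separation, 7 mm focal
# length, 2.2 micron pixel pitch -- reproduces Table 1.
for Z in (1, 2, 10, 50, 100, 200):
    print(Z, round(disparity_pixels(120.0, 7.0, Z, 2.2), 1))
```

Running the loop prints the same distance/disparity pairs listed in Table 1.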
Figure 116 shows an illustration of a stereoscopic image pair captured by the stereo cameras 109120 on the head-mounted display device 109100. Because these images are captured from different perspectives, they exhibit disparity corresponding to their distance from the head-mounted display device 109100. In Figure 117, the two images from Figure 116 are overlaid to show the disparity between the images of the stereo pair. This disparity matches the disparity seen in the 3D labels shown for the objects in Figures 114 and 115. As a result, the 3D labels will be perceived as located at the same depth as the objects with which they are intended to be associated. Figure 118 shows an illustration, as seen by the user, of the 3D labels overlaid onto the see-through view seen with the left eye and right eye.
Figure 119 is a flowchart of a depth-cue method embodiment of the invention. In step 119010, the electronics 109130 in the head-mounted display device 109100 determine the GPS location of the head-mounted display device 109100 using GPS. In optional step 119020, the electronics 109130 determine the direction of the field of view using an electronic compass. This allows the position and direction of the field of view to be determined, so that objects in and near the field of view can be located relative to the user's field of view by comparing the GPS location of the head-mounted display device 109100 with a database of GPS locations of other objects stored in the head-mounted display device 109100, or by connecting to other databases using a wireless connection. In step 119030, objects of interest are identified relative to the user's field of view, either by the electronics 109130 analyzing a database stored in the device 109100 or by communicating wirelessly with another device. In step 119040, the distance to an object of interest is determined by comparing the GPS location of the head-mounted display device 109100 with the GPS location of the object of interest. In step 119050, a label relating to the name of, or other information about, the object of interest is then generated together with a disparity, so as to provide a 3D label at a perceived distance corresponding to the distance to the object of interest. Figure 111 shows an example of labels in the user's field of view including the name, distance, and description of objects of interest. In step 119060, the 3D label for the object of interest is displayed with disparity to the user's left and right eyes to provide the 3D label at the desired depth.
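A minimal sketch of steps 119040-119050 under stated assumptions: the distance between the two GPS fixes is computed with the standard haversine formula (the patent does not specify how the GPS comparison is performed), and the resulting label disparity follows Equation 2 with the example camera values behind Table 1. The function names are illustrative, not from the source:

```python
import math

def gps_distance_m(lat1: float, lon1: float,
                   lat2: float, lon2: float) -> float:
    """Great-circle (haversine) distance in meters between the
    head-mounted display's GPS fix and an object of interest."""
    R = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def label_disparity_px(Z_m: float, T_mm: float = 120.0,
                       f_mm: float = 7.0, pitch_um: float = 2.2) -> float:
    """Disparity (Equation 2, d = T*f/Z) at which to render the 3D
    label so it is perceived at depth Z, in sensor pixels."""
    return (T_mm * f_mm / (Z_m * 1000.0)) * 1000.0 / pitch_um
```

For example, an object 1 m away would be labeled with roughly the 381.8-pixel disparity of Table 1's first row.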
Figure 120 is a flowchart of another depth-cue method embodiment of the invention, in which steps similar to those in Figure 119 are numbered with the same reference numbers used in Figure 119. In step 120140, the distance and direction to the object of interest relative to the user's field of view are determined by the electronics 109130, either within the device or in conjunction with wireless connections to other devices. In step 120160, the 3D label is displayed with disparity to the user's left and right eyes to provide the 3D label at the desired depth and, in addition, the 3D label is provided in the portion of the user's field of view corresponding to the direction to the object of interest. Figure 111 shows an example in which the label for a distant object of interest is provided toward the rear of the user's field of view with the direction toward that distant object of interest; in this example the label is shown as "Town 10 miles in this direction." This feature provides visual cues within the 3D information that let the user easily navigate to the object of interest. It should be noted that 3D labels can be provided over other objects in the see-through view.
Figure 121 is a flowchart of another depth-cue method embodiment of the invention. In this embodiment, a distance-measuring device 109140, such as a rangefinder, is used to determine the distance to an object of interest in the scene. In step 121010, one or more images of the scene adjacent to the head-mounted display device 109100 are captured using the stereo cameras 109120. Alternatively, a single camera can be used to capture the one or more images of the scene. The one or more images of the scene can be of different spectral types; for example, the images can be visible-light images, ultraviolet-light images, infrared-light images, or hyperspectral images. In step 121020, the one or more images are analyzed to identify one or more objects of interest, where the analysis can be performed by the electronics 109130 or the images can be wirelessly transmitted to another device for analysis. In step 121030, the distance to the object of interest is determined using the distance-measuring device 109140. In step 121040, the disparity corresponding to the measured distance to the object of interest is determined. In step 121050, the label or other information for the object of interest is determined. In step 121060, the 3D label or other 3D information for the object of interest is displayed.
Figure 122 is a flowchart of another depth-cue method embodiment of the invention. In this embodiment, the distances to objects in the scene are measured directly by using the stereo cameras to obtain a depth map of the scene. In step 122010, the stereo cameras 109120 are used to capture one or more stereoscopic image sets of the scene adjacent to the head-mounted display device 109100. The one or more stereoscopic image sets of the scene can be of different spectral image types; for example, the stereoscopic images can be visible-light images, ultraviolet-light images, infrared-light images, or hyperspectral images. In step 122020, the one or more stereoscopic image sets are analyzed to identify one or more objects of interest, where the analysis can be performed by the electronics 109130 or the one or more stereoscopic image sets can be wirelessly transmitted to another device for analysis. In step 122030, the images in the one or more stereoscopic image sets are compared to determine the disparity of the one or more objects of interest. In step 122040, labels or other information related to the one or more objects of interest are determined. In step 122050, the 3D labels and/or 3D information for the one or more objects of interest are displayed.
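A toy sketch of the image comparison in step 122030 and the depth it implies, under stated assumptions (naive single-pixel matching along one scanline, and the example camera parameters behind Table 1); real stereo matchers use windowed matching costs and sub-pixel refinement:

```python
def match_disparity(left: list, right: list, max_d: int = 8) -> list:
    """Naive per-pixel matching along a scanline: for each left-image
    pixel, find the shift d minimizing |left[x] - right[x - d]|."""
    disp = []
    for x in range(len(left)):
        best_d, best_err = 0, float("inf")
        for d in range(min(max_d, x) + 1):
            err = abs(left[x] - right[x - d])
            if err < best_err:
                best_d, best_err = d, err
        disp.append(best_d)
    return disp

def depth_from_disparity(disparity_px: float, T_mm: float = 120.0,
                         f_mm: float = 7.0, pitch_um: float = 2.2) -> float:
    """Invert Equation 2 per pixel (Z = T*f/d), converting the
    disparity from pixels to mm on the sensor. Returns meters."""
    d_mm = disparity_px * pitch_um / 1000.0
    return (T_mm * f_mm / d_mm) / 1000.0

# Toy scanlines: the right view is the left view shifted by 2 pixels.
left = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
right = left[2:] + [0, 0]
print(match_disparity(left, right, max_d=4))
```

Feeding the recovered per-pixel disparities through `depth_from_disparity` yields the depth map used to place the 3D labels.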
In embodiments, the invention may use camera focus information to provide display content placement, such as by operating in conjunction with an integrated camera that utilizes an autofocus facility, where information related to the distance to real-world objects in the surrounding environment is extracted by the integrated processor from the autofocus facility, and where the integrated processor determines the placement position of content in the field of view of the optical assembly according to this distance. The field of view may include two separately controllable fields of view, each aligned with one of the user's eyes so that the user can view the surrounding area and content with both eyes, and the placement position of the content includes a placement position for each of the two separately controllable fields of view. The content may include two separate images, where the two separate images are placed separately in each of the two separately controllable fields of view, and where the two separate images can form a 3D image when displayed to the user in the two separately controllable fields of view. The placement position can be determined by extracting a placement value from a table of placement values corresponding to distances to real-world objects. The integrated processor may compute the placement position.
In embodiments, the invention may use rangefinder information to provide display content placement, such as by operating in conjunction with a rangefinder integrated with the eyepiece to determine the distance to real-world objects in the surrounding environment, where the integrated processor determines the placement position of content in the field of view of the optical assembly according to this distance. The field of view may include two separately controllable fields of view, each aligned with one of the user's eyes so that the user can view the surrounding area and content with both eyes, and the placement position of the content includes a placement position for each of the two separately controllable fields of view. The content may include two separate images, where the two separate images are placed separately in each of the two separately controllable fields of view, and where the two separate images can form a 3D image when displayed to the user in the two separately controllable fields of view. The placement position can be determined by extracting a placement value from a table of placement values corresponding to distances to real-world objects. The integrated processor may compute the placement position.
In embodiments, the invention may use multiple distance-determining sensors to provide display content placement, such as by utilizing a plurality of integrated distance-determining sensors that operate to determine the distance to real-world objects in the surrounding environment, where the integrated processor determines the placement position of content in the field of view of the optical assembly according to this distance. The field of view may include two separately controllable fields of view, each aligned with one of the user's eyes so that the user can view the surrounding area and content with both eyes, and the placement position of the content includes a placement position for each of the two separately controllable fields of view. The content may include two separate images, where the two separate images are placed separately in each of the two separately controllable fields of view, and where the two separate images can form a 3D image when displayed to the user in the two separately controllable fields of view. The placement position can be determined by extracting a placement value from a table of placement values corresponding to distances to real-world objects. The integrated processor may compute the placement position. In embodiments, the multiple integrated distance-determining sensors can be camera sensors, rangefinders, and the like.
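The "table of placement values" mentioned in the embodiments above is not detailed in the source. One plausible sketch, assuming the table maps a measured object distance to a per-eye lateral offset derived from Table 1's disparities (an assumption, not the patent's actual table), is:

```python
import bisect

# Hypothetical placement table: measured object distance (m) ->
# total lateral placement offset (pixels), following Table 1.
PLACEMENT_TABLE = [(1, 381.8), (2, 190.9), (10, 38.2),
                   (50, 7.6), (100, 3.8), (200, 1.9)]

def placement_offset(distance_m: float) -> float:
    """Linear interpolation between table rows; clamps at the ends."""
    dists = [d for d, _ in PLACEMENT_TABLE]
    if distance_m <= dists[0]:
        return PLACEMENT_TABLE[0][1]
    if distance_m >= dists[-1]:
        return PLACEMENT_TABLE[-1][1]
    i = bisect.bisect_left(dists, distance_m)
    (d0, p0), (d1, p1) = PLACEMENT_TABLE[i - 1], PLACEMENT_TABLE[i]
    t = (distance_m - d0) / (d1 - d0)
    return p0 + t * (p1 - p0)

def eye_positions(center_x: float, distance_m: float) -> tuple:
    """Split the total disparity between the two separately
    controllable fields of view (one per eye); the sign convention
    here -- left-eye image shifted right -- is a choice, not sourced."""
    half = placement_offset(distance_m) / 2.0
    return center_x + half, center_x - half  # (left-eye x, right-eye x)
```

Interpolating rather than snapping to the nearest row keeps the perceived label depth from jumping as the measured distance changes.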
In embodiments, the invention may use a combination of distance-determining sensors and tracking of the user's eyes to provide display content placement, such as by using multiple integrated sensors (e.g., cameras, rangefinders) together with eye-tracking information from an eye-tracking facility included in conjunction with the optical assembly of the eyepiece to establish the position of an object relative to the field of view (e.g., the angle to the object, the distance to the object). In embodiments, the invention may utilize other facilities related to the placement of content in the field of view of the optical assembly (such as the position and placement of images in the user's peripheral vision), using calibration sequences, using grid positions to assist and/or calibrate, interlacing images to each eye for images at different locations, and the like.
In embodiments, the invention may provide display content control during movement of the eyepiece, such as through an integrated movement-detection facility adapted to detect movement of the head-worn eyepiece while it is worn by the user, where the integrated processor determines the type of movement and reduces the presentation of the displayed content according to the movement type. The movement type can be a vibration, a rapid movement, and the like. The reduction in presentation can be an elimination of the displayed content, a reduction in the brightness of the displayed content, a reduction in the contrast of the displayed content, a change in the focus of the displayed content, and the like.
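A hypothetical sketch of the movement-based reduction described above; the movement classes follow the text, but the sensor thresholds and dimming factors are invented for illustration:

```python
from enum import Enum

class Movement(Enum):
    NONE = 0
    VIBRATION = 1   # high-frequency, low-amplitude motion
    RAPID = 2       # fast head/body movement

def classify_movement(accel_peak_g: float, freq_hz: float) -> Movement:
    """Toy classifier over integrated movement-detection readings;
    the 0.05 g and 4 Hz thresholds are illustrative assumptions."""
    if accel_peak_g < 0.05:
        return Movement.NONE
    if freq_hz > 4.0:
        return Movement.VIBRATION
    return Movement.RAPID

def adjust_display(brightness: float, movement: Movement) -> float:
    """Reduce presentation per movement type: dim during vibration,
    blank entirely during rapid movement, else leave unchanged."""
    if movement is Movement.RAPID:
        return 0.0            # eliminate displayed content
    if movement is Movement.VIBRATION:
        return brightness * 0.5
    return brightness
```

The same dispatch could instead reduce contrast or shift focus, per the alternatives listed in the text.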
Near-field communication (NFC) allows short-range wireless data exchange between an NFC reader and a passive NFC device, where the NFC reader acts as the "initiator" of the communication (providing power for the exchange) and the passive NFC device acts as the "target" (harvesting power from the RF field of the NFC reader in order to provide data back to the reader). One example of this configuration is an NFC reader device that reads identification information from a tag, such as a garment tag. It is noted that NFC is also compatible with radio-frequency identification (RFID) technology. If two electronic devices both include NFC readers and are brought very close to each other, the NFC wireless exchange of data can also be two-way. Examples of this configuration include two NFC-enabled smartphones exchanging information between them (e.g., exchanging electronic business cards), an NFC-enabled smartphone and an NFC-enabled point-of-sale (POS) device exchanging information (e.g., for electronic funds transfer, such as with the GOOGLE Wallet mobile payment system), two NFC-enabled mobile gaming devices exchanging information, and the like. Applications of NFC technology may include electronic funds transfer, mobile payment, file sharing, electronic business card exchange, mobile gaming, social network connections, ticket purchase, boarding pass check-in, POS, coupon collection and/or redemption, tour-guide station initiators, ID cards, keycards, car or hotel keys, and the like. NFC technology has a practical range of about 4 centimeters (theoretically about 20 centimeters), and thus the initiator and target must be in close proximity for communication to occur.
In one example, users can store credit card information in their NFC-enabled smartphone, allowing them, at a retail store, to make an electronic money payment to an NFC-enabled POS terminal by placing their smartphone in very close proximity to it (again, such as implemented in the GOOGLE Wallet mobile payment system). In this way, the user does not need to pull out an actual credit card to transact business, because the credit card information is read from the smartphone by the POS terminal over the NFC connection. However, the user still has the inconvenience of having to take their smartphone out of their pocket or purse, hold it up close to the POS terminal, and then put their smartphone away again.
The present invention provides a scheme for realizing NFC-enabled wireless transactions by providing the user with an NFC watch device (such as worn on the user's wrist), which is then always readily available to be held up to another NFC-enabled device for data exchange. Although the invention describes embodiments of an NFC "watch," this is not meant to be limiting in any way, and those skilled in the art will appreciate alternative implementations that realize the spirit of the invention, such as implementation as a bracelet, a watch fob, a ring, and the like. Embodiments of the NFC watch may include a standalone NFC device, an NFC relay device, and the like, where the NFC relay device establishes communication with both an NFC target device (e.g., an NFC-enabled POS terminal) and a second NFC-enabled device (e.g., the user's smartphone). For example, in the case where the NFC watch serves as a standalone NFC device, it may include the information to be exchanged (e.g., credit card information). In the case where the NFC watch serves as an NFC relay device, the watch does not include the information to be exchanged; instead, the information to be exchanged is stored in another electronic device with which the NFC relay device communicates, such as a smartphone, mobile computing device, personal computer, and the like.
In embodiments where the NFC watch serves as an NFC relay device, users can leave their personal device (such as a smartphone) in their pocket or purse and simply bring the NFC watch close to another NFC-enabled device for data exchange, with the NFC watch providing the communication between the other NFC-enabled device and the user's personal device. For example, a user may wear the NFC watch on their wrist and place the smartphone containing their credit card information in their pocket. When they approach an NFC-enabled POS terminal to make an electronic payment, the user can now simply hold their NFC watch close to the POS terminal without taking out their smartphone, where the NFC watch and the user's smartphone communicate over some non-proximity communication link (e.g., Bluetooth, WiFi, etc.). The NFC watch reads the user's credit card information from the smartphone and transfers the data to the POS terminal. With this configuration, the user never needs to take out their smartphone at all, because they only need to hold their NFC watch close to the POS terminal to make an electronic payment, while keeping all of their personal and financial information centralized in their smartphone.
In embodiments, and referring to Figure 207, the NFC watch 20702 can provide the general functions of a typical watch, such as a face 20704 for the display of the time and date 20708, function buttons 20710, an embedded controller for watch functions, and the like. In addition, however, the NFC watch can provide communications facilities, such as near-field communication to NFC-enabled devices, medium-range communication (e.g., Bluetooth) to nearby electronic devices, longer-range communication (e.g., WiFi) to nearby electronic devices, and the like. In embodiments, the antennas 20712A-B for near-field communication may be provided as an antenna 20712A in the watchband (e.g., with an NFC loop antenna), an antenna 20712B in the watch body (e.g., with an NFC "stamp" antenna), and the like. In the case where the antenna 20712A is located in the watchband, the user operationally holds the band portion of the NFC watch close to the NFC-enabled target device for data exchange. In the case where the antenna 20712B is located in the watch body, the user operationally holds the body portion of the NFC watch close to the NFC-enabled target device for data exchange. In embodiments, the watch display 20704 can also provide a control interface 20718 that allows the user to input and/or select information, such as a credit card number, a verification code, amounts to be transferred in electronic money exchanges, and the like, and where the control interface 20718 may include a display, control buttons, a 2D control pad, a touch screen, and the like.
Referring to Figure 208, one example usage scenario may include a user 20802 wearing an NFC watch 20702A serving as a standalone NFC device, where the NFC watch 20702A is held up to an NFC-enabled POS terminal 20804 to pay for a purchase over the NFC communication link 20804A. In this case, the NFC watch 20702A includes the payment information to be exchanged with the NFC-enabled POS terminal 20804. In embodiments, the information included in the NFC watch 20702A may have previously been entered through a wired or wireless connection to a computing facility (e.g., a mobile computing device, smartphone, personal computer), through a network connection (e.g., a local network connection, a WiFi connection), manually via the control interface 20718, and the like.
Referring to Figure 209, one example usage scenario may include a user 20902 wearing an NFC watch 20702B serving as an NFC relay device that wirelessly communicates 20908A with a smartphone 20908 in the user's pocket, where the information for the data exchange is contained in the user's smartphone 20908. Without the NFC watch, the user would have to take their smartphone out of their pocket and hold it close to the NFC-enabled POS terminal for data exchange. By using the invention, the user 20902 can leave the smartphone 20908 in their pocket and simply hold the NFC watch 20702B up to the NFC-enabled POS terminal 20804, where the NFC watch 20702B communicates 20804A with the NFC-enabled POS terminal 20804, thereby achieving the transfer of information between the smartphone 20908 and the NFC-enabled POS terminal 20804 over the two communication channels 20804A and 20908A established via the NFC watch 20702B. In this configuration, the smartphone 20908 does not need to be NFC-enabled, because all the smartphone 20908 needs is a communication link to the NFC watch 20702B via the medium-range communication link 20908A (e.g., using Bluetooth).
In embodiments, the NFC watch can communicate with a number of other electronic devices over non-NFC medium-range communication links, such as with a personal computer, mobile computer, mobile communications device, navigation device, wearable electronic device, augmented-reality glasses, head-worn electronic device, home entertainment device, home security device, home automation device, local network connection, personal network connection, and the like. For example, the NFC watch can communicate with an eyepiece, such as an eyepiece that includes optics enabling a see-through display on which content provided from an integrated processor can be presented, and where aspects of the eyepiece can be controlled through one or more composite control techniques involving sensors, cameras, tactile interfaces, accessory devices, and the like. In embodiments, the NFC watch can communicate with the glasses to present transaction details in the glasses. The glasses control system can be used as the interface for any interaction required for the transaction.
In embodiments, the NFC watch can provide computing resources, such as a microcontroller, memory, input/output facilities independent of the communication links (e.g., a memory card, wired connections), wireless connections to the local network for updates, programmability, and the like. For example, the NFC relay device can provide memory to store a history of goods purchased, preferences, personal profiles, sale offers, redemption codes, preferred-customer IDs, incentive messages, loyalty program data, and the like.
The methods and systems described herein, particularly embodiments of the inventive augmented reality eyepiece, can be adapted to communicate and receive communications through and/or via any electronic communication system or network. Examples of such electronic communication system and network types, and their associated protocols, topologies, network elements, and the like, include the following: (1) wired networks, such as: (a) wide area networks (WANs) using leased lines and digital subscriber lines with protocols such as Point-to-Point Protocol (PPP), High-Level Data Link Control (HDLC), and Synchronous Data Link Control (SDLC); circuit switching using protocols such as PPP and ISDN; packet switching using protocols such as Frame Relay, X.25 (predating the OSI stack), Synchronous Optical Networking/Synchronous Digital Hierarchy (SONET/SDH), Multiprotocol Label Switching (MPLS), Switched Multi-megabit Data Service (SMDS), and Ethernet (e.g., 10GB, 100GB); cell relay using protocols such as the Asynchronous Transfer Mode (ATM) protocol; and network elements such as routers, switches, network hubs, and firewalls; (b) metropolitan area networks (MANs) using: protocols such as ATM, Fiber Distributed Data Interface (FDDI), SMDS, Metro Ethernet, and Distributed Queue Dual Bus (DQDB); topologies such as star, bus, mesh, ring, and tree; and network elements such as routers, switches, network hubs, and firewalls; (c) local area networks (LANs) using, for example: high-speed serial interface protocols such as Ethernet (e.g., Ethernet, Fast, 1GB, 10GB, and 100GB); topologies such as star and tree; and network elements such as routers, switches, network hubs, and firewalls; (d) personal area networks (PANs) using technologies such as USB and FireWire; (2) wireless networks: (a) wide area networks (WANs) using: standards such as RTT (CDMA), EDGE (GSM), EV-DO (CDMA/TDMA), Flash-OFDM, GPRS (GSM), HSPA D and U (UMTS/3GSM), LTE (3GPP), UMTS-TDD (UMTS/3GSM), WiMAX (802.16), satellite, and mobile Internet generally of 3G and 4G; network elements such as the base station subsystem, the network and switching subsystem, the GPRS core network, the operations support system, the subscriber identity module (SIM), the UMTS Terrestrial Radio Access Network (UTRAN), and the core network; and interfaces such as W-CDMA (UTRA-FDD)-UMTS, UTRA-TDD HCR-UMTS, TD-SCDMA-UMTS, the user equipment interface-UMTS, radio resource control (radio link control, media access control), and the Um interface (for the GSM air interface, with layers such as a physical layer with GMSK or 8PSK modulation, a data link layer such as LAPDm, and a network layer with radio resource, mobility management, and call control); (b) metropolitan area networks (MANs) using protocols such as WiMAX (802.16); local area networks (LANs) using technologies such as Wi-Fi with ad-hoc and infrastructure modes, OSI-layer sub-technologies such as OFDM, CSMA/CA, and spread spectrum, network elements such as routers, switches, network hubs, firewalls, access points, and base stations, and clients such as personal computers, laptop computers, IP phones, mobile phones, and smartphones; (c) personal area networks (PANs) using topologies such as star, tree, and mesh, where the networks use technologies such as: (i) Bluetooth (e.g., using roles (such as master and slave, and simultaneous master/slave), protocol stacks (such as core protocols, cable replacement protocols, telephony control protocols, and adopted protocols), mandatory protocols (such as the Link Manager Protocol (LMP), the Logical Link Control and Adaptation Protocol (L2CAP), and the Service Discovery Protocol (SDP)), pairing methods (such as legacy pairing and Secure Simple Pairing), and air interfaces (such as the license-free ISM band (2.402-2.480 GHz))), (ii) the Infrared Data Association (IrDA) (e.g., using mandatory protocol stack layers (e.g., the Infrared Physical Layer Specification (IrPHY), the Infrared Link Access Protocol (IrLAP), the Infrared Link Management Protocol (IrLMP)) or optional protocol stack layers (e.g., the Tiny Transport Protocol (Tiny TP), the Infrared Communications Protocol (IrCOMM), Object Exchange (OBEX), Infrared LAN (IrLAN), IrSimple, and IrSimpleShot)), (iii) Wireless USB, (iv) Z-Wave (e.g., a mesh network topology with one or more master controllers routing with source routing, and GFSK modulation with security), (v) ZigBee (e.g., with the physical and medium access control layers defined in 802.15.4, layers defined above them such as the network layer, application layer, ZigBee device objects, and manufacturer-defined application components, and using CSMA/CA), (vi) body area networks, and (vii) Wi-Fi; and (3) near-field communication (NFC), such as operating at 13.56 MHz with a peer-to-peer network type per ISO/IEC 18000-3, with data rates of 106 kbit/s-424 kbit/s, and with passive and/or active communication modes. The methods and systems described herein, particularly embodiments of the inventive augmented reality eyepiece, can be applied to any or all aspects of mobile device network management systems, such as policy management, user management, profile management, business intelligence, event management, performance management, enterprise-class multi-platform mobile device management (including sub-aspects such as software and SaaS), security management (including sub-aspects such as certificate control (e.g., relating to email, applications, Wi-Fi access, and VPN access), password enforcement, device wipe, remote lock, audit trails/logging, centralized device configuration verification, jailbreak/root detection, secure containers, and application wrapping), platform support (e.g., Android, iOS, BlackBerry, Symbian, Windows Mobile, and Windows Phone), compliance management, software management (including sub-aspects such as application downloaders, application verification, application update support, application patch support, and application store support (e.g., enterprise and third-party application stores)), and hardware/device management (e.g., including device enrollment (e.g., ownership, classification, registration, user identification, EULA development, and restriction development), external memory blocking, and configuration change history). The methods and systems described herein, particularly embodiments of the inventive augmented reality eyepiece, can be used with any type of private, community, or hybrid cloud computing network or cloud computing environment, including those involving features of software as a service (SaaS), platform as a service (PaaS), and/or infrastructure as a service (IaaS).
The methods and systems described herein may be deployed in part or in whole through a machine that executes computer software, program code, and/or instructions on a processor. The processor may be part of a server, cloud server, client, network infrastructure, mobile computing platform, fixed computing platform, or other computing platform. The processor may be any kind of computational or processing device capable of executing program instructions, code, binary instructions, and the like. The processor may be or may include a signal processor, digital processor, embedded processor, microprocessor, or any variant such as a co-processor (math co-processor, graphics co-processor, communications co-processor, and the like) that may directly or indirectly facilitate the execution of program code or program instructions stored thereon. In addition, the processor may enable the execution of multiple programs, threads, and code. Threads may be executed simultaneously to enhance the performance of the processor and to facilitate simultaneous operation of applications. By way of implementation, the methods, program code, program instructions, and the like described herein may be implemented in one or more threads. A thread may spawn other threads that may have assigned priorities associated with them; the processor may execute these threads based on priority, or in any other order based on instructions provided in the program code.
The processor may include memory that stores methods, codes, instructions, and programs as described herein and elsewhere. The processor may access, through an interface, a storage medium that stores methods, codes, and instructions as described herein and elsewhere. The storage medium associated with the processor for storing methods, programs, codes, program instructions, or other types of instructions capable of being executed by a computing or processing device may include, but is not limited to, one or more of a CD-ROM, DVD, memory, hard disk, flash drive, RAM, ROM, cache, and the like.
A processor may include one or more cores that may enhance the speed and performance of a multiprocessor. In embodiments, the processor may be a dual-core processor, a quad-core processor, or another chip-level multiprocessor that combines two or more independent cores (referred to as a die).
The methods and systems described herein may be deployed in part or in whole through a machine that executes computer software on a server, client, firewall, gateway, hub, router, or other such computer and/or networking hardware. A software program may be associated with a server that may include a file server, print server, domain server, internet server, intranet server, or other variant such as a secondary server, host server, distributed server, and the like. The server may include one or more of memories, processors, computer-readable media, storage media, ports (physical and virtual), communication devices, and interfaces capable of accessing other servers, clients, machines, and devices through a wired or wireless medium, and the like. The methods, programs, or codes described herein and elsewhere may be executed by the server. In addition, other devices required for execution of the methods described in this application may be considered part of the infrastructure associated with the server.
The server may provide an interface to other devices including, without limitation, clients, other servers, printers, database servers, print servers, file servers, communication servers, distributed servers, social networks, and the like. Additionally, this coupling and/or connection may facilitate remote execution of programs across a network. The networking of some or all of these devices may facilitate parallel processing of a program or method at one or more locations without deviating from the scope of the invention. In addition, any of the devices attached to the server through an interface may include at least one storage medium capable of storing methods, programs, code, and/or instructions. A central repository may provide program instructions to be executed on different devices. In this implementation, the remote repository may act as a storage medium for program code, instructions, and programs.
A software program may be associated with a client that may include a file client, print client, domain client, internet client, intranet client, or other variant such as a secondary client, host client, distributed client, and the like. The client may include one or more of memories, processors, computer-readable media, storage media, ports (physical and virtual), communication devices, and interfaces capable of accessing other clients, servers, machines, and devices through a wired or wireless medium, and the like. The methods, programs, or codes described herein and elsewhere may be executed by the client. In addition, other devices required for execution of the methods described in this application may be considered part of the infrastructure associated with the client.
The client may provide an interface to other devices including, without limitation, servers, cloud servers, other clients, printers, database servers, print servers, file servers, communication servers, distributed servers, social networks, and the like. Additionally, this coupling and/or connection may facilitate remote execution of programs across a network. The networking of some or all of these devices may facilitate parallel processing of a program or method at one or more locations without deviating from the scope of the invention. In addition, any of the devices attached to the client through an interface may include at least one storage medium capable of storing methods, programs, code, and/or instructions. A central repository may provide program instructions to be executed on different devices. In this implementation, the remote repository may act as a storage medium for program code, instructions, and programs.
The methods and systems described herein may be deployed in part or in whole through network infrastructures. The network infrastructure may include elements such as computing devices, servers, cloud servers, routers, hubs, firewalls, clients, personal computers, communication devices, routing devices, and other active and passive devices, modules, and/or components known in the art. The computing and/or non-computing devices associated with the network infrastructure may include, apart from other components, a storage medium such as flash memory, a buffer, a stack, RAM, ROM, and the like. The processes, methods, program codes, and instructions described herein and elsewhere may be executed by one or more of the network infrastructure elements.
The methods, program codes, and instructions described herein and elsewhere may be implemented on a cellular network having multiple cells. The cellular network may be either a frequency division multiple access (FDMA) network or a code division multiple access (CDMA) network. The cellular network may include mobile devices, cell sites, base stations, repeaters, antennas, towers, and the like. The cellular network may be a GSM, GPRS, 3G, EVDO, mesh, or other network type.
The methods, program codes, and instructions described herein and elsewhere may be implemented on or through mobile devices. The mobile devices may include navigation devices, cell phones, mobile phones, mobile personal digital assistants, laptops, palmtops, netbooks, pagers, electronic book readers, music players, and the like. These devices may include, apart from other components, a storage medium such as flash memory, a buffer, RAM, ROM, and one or more computing devices. The computing devices associated with mobile devices may be enabled to execute program codes, methods, and instructions stored thereon. Alternatively, the mobile devices may be configured to execute instructions in collaboration with other devices. The mobile devices may communicate with base stations that are interfaced with servers and configured to execute program codes. The mobile devices may communicate on a peer-to-peer network, mesh network, or other communications network. The program code may be stored on a storage medium associated with the server and executed by a computing device embedded within the server. The base station may include a computing device and a storage medium. The storage device may store program codes and instructions executed by the computing devices associated with the base station.
The computer software, program codes, and/or instructions may be stored on and/or accessed from machine-readable media, which may include: computer components, devices, and recording media that retain digital data used for computing for some interval of time; semiconductor storage known as random access memory (RAM); mass storage typically used for more permanent storage, such as optical discs and forms of magnetic storage like hard disks, tapes, drums, cards, and other types; processor registers, cache memory, volatile memory, non-volatile memory; optical storage such as CD and DVD; removable media such as flash memory (e.g., USB sticks or keys), floppy disks, magnetic tape, paper tape, punch cards, standalone RAM disks, Zip drives, removable mass storage, off-line storage, and the like; and other computer memory such as dynamic memory, static memory, read/write storage, mutable storage, read-only, random-access, sequential-access, location-addressable, file-addressable, and content-addressable storage, network-attached storage, storage area networks, bar codes, magnetic ink, and the like.
The methods and systems described herein may transform physical and/or intangible items from one state to another. The methods and systems described herein may also transform data representing physical and/or intangible items from one state to another.
The elements described and depicted herein, including in the flow charts and block diagrams throughout the figures, imply logical boundaries between the elements. However, according to software or hardware engineering practices, the depicted elements and their functions may be implemented on machines through computer-executable media having a processor capable of executing program instructions stored thereon as a monolithic software structure, as standalone software modules, or as modules that employ external routines, code, services, and so forth, or any combination of these, and all such implementations may be within the scope of the present invention. Examples of such machines may include, but are not limited to, personal digital assistants, laptops, personal computers, mobile phones, other handheld computing devices, medical equipment, wired or wireless communication devices, transducers, chips, calculators, satellites, tablet PCs, electronic books, gadgets, electronic devices, devices having artificial intelligence, computing devices, networking equipment, servers, routers, and the like. Furthermore, the elements depicted in the flow charts and block diagrams, or any other logical components, may be implemented on a machine capable of executing program instructions. Thus, while the foregoing drawings and descriptions set forth functional aspects of the disclosed systems, no particular arrangement of software for implementing these functional aspects should be inferred from these descriptions unless explicitly stated or otherwise clear from the context. Similarly, it will be appreciated that the various steps identified and described above may be varied, and that the order of the steps may be adapted to particular applications of the techniques disclosed herein. All such variations and modifications are intended to fall within the scope of the present disclosure. Accordingly, the depiction and/or description of an order for various steps should not be understood to require a particular order of execution for those steps, unless required by a particular application, or explicitly stated or otherwise clear from the context.
The methods and/or processes described above, and the steps thereof, may be realized in hardware, software, or any combination of hardware and software suitable for a particular application. The hardware may include a general-purpose computer and/or a dedicated computing device, or a specific computing device or particular aspect or component of a specific computing device. The processes may be realized in one or more microprocessors, microcontrollers, embedded microcontrollers, programmable digital signal processors, or other programmable devices, along with internal and/or external memory. The processes may also, or instead, be embodied in an application-specific integrated circuit, a programmable gate array, programmable array logic, or any other device or combination of devices that may be configured to process electronic signals. It will further be appreciated that one or more of the processes may be realized as computer-executable code capable of being executed on a machine-readable medium.
The computer-executable code may be created using a structured programming language such as C, an object-oriented programming language such as C++, or any other high-level or low-level programming language (including assembly languages, hardware description languages, and database programming languages and technologies) that may be stored, compiled, or interpreted to run on one of the above devices, as well as on heterogeneous combinations of processors, processor architectures, combinations of different hardware and software, or any other machine capable of executing program instructions.
Thus, in one aspect, each method described above, and combinations thereof, may be embodied in computer-executable code that, when executed on one or more computing devices, performs the steps thereof. In another aspect, the methods may be embodied in systems that perform the steps thereof, and may be distributed across devices in a number of ways, or all of the functionality may be integrated into a dedicated, standalone device or other hardware. In another aspect, the means for performing the steps associated with the processes described above may include any of the hardware and/or software described above. All such permutations and combinations are intended to fall within the scope of the present disclosure.
While the invention has been disclosed in connection with the preferred embodiments shown and described in detail, various modifications and improvements thereon will become readily apparent to those skilled in the art. Accordingly, the spirit and scope of the present invention are not to be limited by the foregoing examples, but are to be understood in the broadest sense allowable by law.
All documents referenced herein are hereby incorporated by reference.

Claims (9)

1. A system, comprising:
an interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content, wherein the content is a video image;
an integrated processor for handling content for display to the user; and
an integrated image source for introducing the content to the optical assembly;
wherein the processor is adapted to modify the content, wherein the modification is made in response to a sensor input, the modification including adjusting at least one parameter of video capture based on a movement of the user indicated by the sensor input.
2. the system as claimed in claim 1, which is characterized in that the modification further comprises at least one of: adjustment is bright Degree, adjustment colour balance, adjustment tone, adjustment video resolution, adjustment transparency, adjustment compression ratio, is adjusted adjustment color saturation Whole frame per second per second, a part that video is isolated stop playing video, suspend video or restart video.
3. the system as claimed in claim 1, which is characterized in that sensor input be from it is following at least one obtain: charge coupling It is clutch part, black silicon sensor, IR sensor, acoustic sensor, inductive pick-up, motion sensor, optical sensor, opaque Spend sensor, proximity sensor, inductance sensor, eddy current sensor, passive infrared proximity sensor, radar, capacitive displacement Sensor, hall effect sensor, Magnetic Sensor, GPS sensor, thermal imaging sensor, thermocouple, thermistor, photoelectric transfer Sensor, ultrasonic sensor, infrared laser sensor, inertia motion sensor, MEMS internal motion sensor, ultrasonic 3D motion pass Sensor, accelerometer, dipmeter, force snesor, piezoelectric transducer, rotary encoder, linear encoder, chemical sensor, ozone Sensor, smoke sensor device, heat sensor, magnetometer, carbon dioxide detector, carbon monoxide detector, oxygen sensor, Portugal Grape sugar sensor, smoke detector, metal detector, Raindrop sensor, altimeter, activity detector, object detector, label Detector, laser range finder, sonar, capacitance sensor, heart rate sensor or RF/ micropower impulse radio (MIR) sensor.
4. The system of claim 2, wherein play of the content is stopped in response to an indication, input from an accelerometer, that the head of the user is moving.
5. The system of claim 3, wherein an audio sensor input is generated by the speaking of at least one participant of a video conference.
6. The system of claim 3, wherein a visual sensor input is a video image of at least one participant of a video conference.
7. The system of claim 3, wherein a visual sensor input is a video image of a visual presentation.
8. The system of claim 6, wherein the modification is making at least one of the video images more or less transparent in response to an indication from a sensor that the user is moving.
9. A system, comprising:
an interactive head-mounted eyepiece worn by a user, wherein the eyepiece includes an optical assembly through which the user views a surrounding environment and displayed content, wherein the content is a video image;
an integrated processor for handling content for display to the user;
an integrated image source for introducing the content to the optical assembly;
wherein the processor is adapted to modify the content, wherein the modification is made in response to a sensor input, the modification including adjusting at least one parameter of video capture based on a movement of the user indicated by the sensor input; and
further comprising an integrated video image capture facility, wherein the integrated video image capture facility records the surrounding environment and provides the content for display.
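The sensor-driven content modification recited in the claims above can be sketched in a few lines of code. This is a hedged illustration only: the accelerometer thresholds, field names, and transparency policy below are hypothetical and are not the patented implementation.

```python
from dataclasses import dataclass

@dataclass
class DisplayState:
    playing: bool = True
    transparency: float = 0.0  # 0.0 = opaque overlay, 1.0 = fully transparent

def modify_content(state: DisplayState, accel_magnitude: float) -> DisplayState:
    """Adjust displayed-video parameters from an accelerometer reading.

    accel_magnitude is a hypothetical measure of head movement; the
    thresholds are illustrative, not values from the patent.
    """
    if accel_magnitude > 3.0:
        # Strong head movement: stop play of the content (cf. claim 4).
        return DisplayState(playing=False, transparency=1.0)
    if accel_magnitude > 1.0:
        # Moderate movement: make the video more transparent (cf. claim 8)
        # so the user can still see the surrounding environment.
        return DisplayState(playing=True, transparency=0.5)
    # Little or no movement: leave the content unchanged.
    return DisplayState(playing=True, transparency=0.0)

print(modify_content(DisplayState(), 0.2))  # content unchanged
print(modify_content(DisplayState(), 1.5))  # content made semi-transparent
print(modify_content(DisplayState(), 4.0))  # content stopped
```

In a real eyepiece this decision would run on the integrated processor, with the sensor input arriving from one of the devices enumerated in claim 3.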
CN201280046955.XA 2011-09-26 2012-09-26 Video display modification based on sensor input to see-through, near-eye displays Expired - Fee Related CN103946732B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201161539269P 2011-09-26 2011-09-26
US61/539,269 2011-09-26
PCT/US2012/057387 WO2013049248A2 (en) 2011-09-26 2012-09-26 Video display modification based on sensor input for a see-through near-to-eye display

Publications (2)

Publication Number Publication Date
CN103946732A CN103946732A (en) 2014-07-23
CN103946732B true CN103946732B (en) 2019-06-14

Family

ID=47996727

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201280046955.XA Expired - Fee Related CN103946732B (en) 2011-09-26 2012-09-26 Video display modification based on sensor input to see-through, near-eye displays

Country Status (5)

Country Link
EP (1) EP2761362A4 (en)
JP (1) JP2015504616A (en)
KR (1) KR20140066258A (en)
CN (1) CN103946732B (en)
WO (1) WO2013049248A2 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI687721B (en) 2010-11-08 2020-03-11 盧森堡商喜瑞爾工業公司 Display device
US20210200845A1 (en) * 2019-12-31 2021-07-01 Atlassian Pty Ltd. Illumination-based user authentication
WO2023076841A1 (en) * 2021-10-25 2023-05-04 Atieva, Inc. Contextual vehicle control with visual representation
US20230186800A1 (en) * 2021-12-15 2023-06-15 Motorola Mobility Llc Augmented reality display device having contextual adaptive brightness
US11800246B2 (en) 2022-02-01 2023-10-24 Landscan Llc Systems and methods for multispectral landscape mapping
EP3444667B1 (en) * 2013-05-09 2023-12-06 IMAX Theatres International Limited Methods and systems of vibrating a screen

Families Citing this family (599)

Publication number Priority date Publication date Assignee Title
US9158116B1 (en) 2014-04-25 2015-10-13 Osterhout Group, Inc. Temple and ear horn assembly for headworn computer
US9229233B2 (en) 2014-02-11 2016-01-05 Osterhout Group, Inc. Micro Doppler presentations in head worn computing
US9965681B2 (en) 2008-12-16 2018-05-08 Osterhout Group, Inc. Eye imaging in head worn computing
US9952664B2 (en) 2014-01-21 2018-04-24 Osterhout Group, Inc. Eye imaging in head worn computing
US9366867B2 (en) 2014-07-08 2016-06-14 Osterhout Group, Inc. Optical systems for see-through displays
US9298007B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Eye imaging in head worn computing
US9400390B2 (en) 2014-01-24 2016-07-26 Osterhout Group, Inc. Peripheral lighting for head worn computing
US20150277120A1 (en) 2014-01-21 2015-10-01 Osterhout Group, Inc. Optical configurations for head worn computing
US20150205111A1 (en) 2014-01-21 2015-07-23 Osterhout Group, Inc. Optical configurations for head worn computing
US9715112B2 (en) 2014-01-21 2017-07-25 Osterhout Group, Inc. Suppression of stray light in head worn computing
US9091851B2 (en) 2010-02-28 2015-07-28 Microsoft Technology Licensing, Llc Light control in head mounted displays
US20150309316A1 (en) 2011-04-06 2015-10-29 Microsoft Technology Licensing, Llc Ar glasses with predictive control of external device based on event input
US9097890B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc Grating in a light transmissive illumination system for see-through near-eye display glasses
US9129295B2 (en) 2010-02-28 2015-09-08 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear
US9759917B2 (en) 2010-02-28 2017-09-12 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered AR eyepiece interface to external devices
US9182596B2 (en) 2010-02-28 2015-11-10 Microsoft Technology Licensing, Llc See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light
US9134534B2 (en) 2010-02-28 2015-09-15 Microsoft Technology Licensing, Llc See-through near-eye display glasses including a modular image source
US9285589B2 (en) 2010-02-28 2016-03-15 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered control of AR eyepiece applications
US9341843B2 (en) 2010-02-28 2016-05-17 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a small scale image source
US9229227B2 (en) 2010-02-28 2016-01-05 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US9097891B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment
US10180572B2 (en) 2010-02-28 2019-01-15 Microsoft Technology Licensing, Llc AR glasses with event and user action control of external applications
WO2011106797A1 (en) 2010-02-28 2011-09-01 Osterhout Group, Inc. Projection triggering through an external marker in an augmented reality eyepiece
US20120249797A1 (en) 2010-02-28 2012-10-04 Osterhout Group, Inc. Head-worn adaptive display
US9128281B2 (en) 2010-09-14 2015-09-08 Microsoft Technology Licensing, Llc Eyepiece with uniformly illuminated reflective display
US9223134B2 (en) 2010-02-28 2015-12-29 Microsoft Technology Licensing, Llc Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses
US9366862B2 (en) 2010-02-28 2016-06-14 Microsoft Technology Licensing, Llc System and method for delivering content to a group of see-through near eye display eyepieces
WO2014071179A2 (en) * 2012-11-01 2014-05-08 Thermalens, Llc Thermally influenced changeable tint device
US9933684B2 (en) 2012-11-16 2018-04-03 Rockwell Collins, Inc. Transparent waveguide display providing upper and lower fields of view having a specific light output aperture configuration
GB2515460B (en) * 2013-04-12 2016-01-06 Two Trees Photonics Ltd Near-eye device
US9417471B2 (en) 2013-04-30 2016-08-16 Research Frontiers Incorporated Method and device for protecting objects from degradation by light with suspended particle device light valves
US9280972B2 (en) 2013-05-10 2016-03-08 Microsoft Technology Licensing, Llc Speech to text conversion
US9740030B2 (en) * 2013-05-23 2017-08-22 Omnivision Technologies, Inc. Near-eye display systems, devices and methods
WO2014193326A1 (en) * 2013-05-29 2014-12-04 Baltaci Cetin Ozgur System for forming a virtual image
WO2014194066A1 (en) * 2013-05-30 2014-12-04 Charles Anthony Smith Hud object design and method
CN103336435B (en) * 2013-06-19 2015-10-28 河海大学常州校区 Gyroscope is based on the method for adaptive fuzzy sliding mode control of Attitude rate estimator
JP2015009630A (en) * 2013-06-27 2015-01-19 庸 菊池 Vehicle inspection recording unit
JP6205189B2 (en) * 2013-06-28 2017-09-27 オリンパス株式会社 Information presentation system and method for controlling information presentation system
US9563331B2 (en) * 2013-06-28 2017-02-07 Microsoft Technology Licensing, Llc Web-like hierarchical menu display configuration for a near-eye display
TW201502581A (en) 2013-07-11 2015-01-16 Seiko Epson Corp Head mounted display device and control method for head mounted display device
JP6252002B2 (en) * 2013-07-11 2017-12-27 セイコーエプソン株式会社 Head-mounted display device and method for controlling head-mounted display device
EP2833196B1 (en) * 2013-08-02 2016-03-16 ESSILOR INTERNATIONAL (Compagnie Générale d'Optique) A method of controlling a programmable ophthalmic lens device
KR20150018264A (en) * 2013-08-09 2015-02-23 엘지전자 주식회사 Wearable glass-type device and control method thereof
JP6111932B2 (en) * 2013-08-26 2017-04-12 ソニー株式会社 Action support device, action support method, program, and storage medium
JP6337433B2 (en) * 2013-09-13 2018-06-06 セイコーエプソン株式会社 Head-mounted display device and method for controlling head-mounted display device
US9418273B2 (en) 2013-09-18 2016-08-16 Blackberry Limited Structure for multicolor biometric scanning user interface
US9311545B2 (en) 2013-09-18 2016-04-12 Blackberry Limited Multicolor biometric scanning user interface
JP5877824B2 (en) * 2013-09-20 2016-03-08 ヤフー株式会社 Information processing system, information processing method, and information processing program
US9763071B2 (en) * 2013-09-22 2017-09-12 Ricoh Company, Ltd. Mobile information gateway for use in emergency situations or with special equipment
US20150088547A1 (en) * 2013-09-22 2015-03-26 Ricoh Company, Ltd. Mobile Information Gateway for Home Healthcare
KR102088020B1 (en) 2013-09-26 2020-03-11 엘지전자 주식회사 A head mounted display ant the method of controlling thereof
US11672459B2 (en) * 2013-10-14 2023-06-13 Neurovigil, Inc. Localized collection of biological signals, cursor control in speech-assistance interface based on biological electrical signals and arousal detection based on biological electrical signals
KR102462848B1 (en) * 2013-10-16 2022-11-03 매직 립, 인코포레이티드 Virtual or augmented reality headsets having adjustable interpupillary distance
CN103536279A (en) * 2013-10-22 2014-01-29 德赛电子(惠州)有限公司 Intelligent wristband and adaptive method thereof
US10258256B2 (en) 2014-12-09 2019-04-16 TechMah Medical Bone reconstruction and orthopedic implants
US9420178B2 (en) 2013-12-20 2016-08-16 Qualcomm Incorporated Thermal and power management
US9448621B2 (en) * 2013-12-20 2016-09-20 Nokia Technologies Oy Causation of display of information on a see through display
US10321124B2 (en) 2014-01-10 2019-06-11 Nokia Technologies Oy Display of a visual representation of a view
US9671613B2 (en) 2014-09-26 2017-06-06 Osterhout Group, Inc. See-through computer display systems
US11103122B2 (en) 2014-07-15 2021-08-31 Mentor Acquisition One, Llc Content presentation in head worn computing
US9575321B2 (en) 2014-06-09 2017-02-21 Osterhout Group, Inc. Content presentation in head worn computing
US10684687B2 (en) 2014-12-03 2020-06-16 Mentor Acquisition One, Llc See-through computer display systems
US9810906B2 (en) 2014-06-17 2017-11-07 Osterhout Group, Inc. External user interface for head worn computing
US10191279B2 (en) 2014-03-17 2019-01-29 Osterhout Group, Inc. Eye imaging in head worn computing
US20150277118A1 (en) 2014-03-28 2015-10-01 Osterhout Group, Inc. Sensor dependent content position in head worn computing
US20160019715A1 (en) 2014-07-15 2016-01-21 Osterhout Group, Inc. Content presentation in head worn computing
US9529195B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US9841599B2 (en) 2014-06-05 2017-12-12 Osterhout Group, Inc. Optical configurations for head-worn see-through displays
US9366868B2 (en) 2014-09-26 2016-06-14 Osterhout Group, Inc. See-through computer display systems
US10254856B2 (en) 2014-01-17 2019-04-09 Osterhout Group, Inc. External user interface for head worn computing
US9939934B2 (en) 2014-01-17 2018-04-10 Osterhout Group, Inc. External user interface for head worn computing
US9299194B2 (en) 2014-02-14 2016-03-29 Osterhout Group, Inc. Secure sharing in head worn computing
US9448409B2 (en) 2014-11-26 2016-09-20 Osterhout Group, Inc. See-through computer display systems
US9594246B2 (en) 2014-01-21 2017-03-14 Osterhout Group, Inc. See-through computer display systems
US10649220B2 (en) 2014-06-09 2020-05-12 Mentor Acquisition One, Llc Content presentation in head worn computing
US11227294B2 (en) 2014-04-03 2022-01-18 Mentor Acquisition One, Llc Sight information collection in head worn computing
US9746686B2 (en) 2014-05-19 2017-08-29 Osterhout Group, Inc. Content position calibration in head worn computing
US9829707B2 (en) 2014-08-12 2017-11-28 Osterhout Group, Inc. Measuring content brightness in head worn computing
US9811159B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US9766463B2 (en) 2014-01-21 2017-09-19 Osterhout Group, Inc. See-through computer display systems
US11737666B2 (en) 2014-01-21 2023-08-29 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9532714B2 (en) 2014-01-21 2017-01-03 Osterhout Group, Inc. Eye imaging in head worn computing
US11669163B2 (en) 2014-01-21 2023-06-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US11487110B2 (en) 2014-01-21 2022-11-01 Mentor Acquisition One, Llc Eye imaging in head worn computing
US20150205135A1 (en) 2014-01-21 2015-07-23 Osterhout Group, Inc. See-through computer display systems
US9529199B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US9753288B2 (en) 2014-01-21 2017-09-05 Osterhout Group, Inc. See-through computer display systems
US9494800B2 (en) 2014-01-21 2016-11-15 Osterhout Group, Inc. See-through computer display systems
US12105281B2 (en) 2014-01-21 2024-10-01 Mentor Acquisition One, Llc See-through computer display systems
US9310610B2 (en) 2014-01-21 2016-04-12 Osterhout Group, Inc. See-through computer display systems
US11892644B2 (en) 2014-01-21 2024-02-06 Mentor Acquisition One, Llc See-through computer display systems
US12093453B2 (en) 2014-01-21 2024-09-17 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US9836122B2 (en) 2014-01-21 2017-12-05 Osterhout Group, Inc. Eye glint imaging in see-through computer display systems
US9846308B2 (en) 2014-01-24 2017-12-19 Osterhout Group, Inc. Haptic systems for head-worn computers
JP6264542B2 (en) * 2014-01-30 2018-01-24 任天堂株式会社 Information processing apparatus, information processing program, information processing system, and information processing method
JP6334715B2 (en) 2014-01-31 2018-05-30 エンパイア テクノロジー ディベロップメント エルエルシー Augmented Reality Skin Evaluation
US9865088B2 (en) 2014-01-31 2018-01-09 Empire Technology Development Llc Evaluation of augmented reality skins
JP6205498B2 (en) * 2014-01-31 2017-09-27 エンパイア テクノロジー ディベロップメント エルエルシー Target person-selectable augmented reality skin
KR102177133B1 (en) 2014-01-31 2020-11-10 매직 립, 인코포레이티드 Multi-focal display system and method
KR101827550B1 (en) 2014-01-31 2018-02-08 엠파이어 테크놀로지 디벨롭먼트 엘엘씨 Augmented reality skin manager
US20150241964A1 (en) 2014-02-11 2015-08-27 Osterhout Group, Inc. Eye imaging in head worn computing
US9401540B2 (en) 2014-02-11 2016-07-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
CA2939928C (en) 2014-02-19 2021-06-22 Evergaze, Inc. Apparatus and method for improving, augmenting or enhancing vision
JP2015166816A (en) * 2014-03-04 2015-09-24 富士通株式会社 Display device, display control program, and display control method
NZ762223A (en) * 2014-03-05 2022-02-25 Univ Arizona Wearable 3D augmented reality display with variable focus and/or object recognition
US11408699B2 (en) 2014-03-21 2022-08-09 Armaments Research Company Inc. Firearm usage monitoring system
JP6504157B2 (en) * 2014-03-26 2019-04-24 ソニー株式会社 Experience induction device, experience introduction system, and experience introduction method
US20160187651A1 (en) 2014-03-28 2016-06-30 Osterhout Group, Inc. Safety for a vehicle operator with an hmd
US10354551B2 (en) 2014-04-09 2019-07-16 Lg Electronics Inc. Mobile terminal and method for controlling the same
WO2015160828A1 (en) * 2014-04-15 2015-10-22 Huntington Ingalls Incorporated System and method for augmented reality display of dynamic environment information
DE102014207490B3 (en) * 2014-04-17 2015-07-02 Carl Zeiss Ag Spectacle lens for a display device to be placed on the head of a user and an image-generating display device and display device with such a spectacle lens
US9651787B2 (en) 2014-04-25 2017-05-16 Osterhout Group, Inc. Speaker assembly for headworn computer
US10853589B2 (en) 2014-04-25 2020-12-01 Mentor Acquisition One, Llc Language translation with head-worn computing
US9423842B2 (en) 2014-09-18 2016-08-23 Osterhout Group, Inc. Thermal management for head-worn computer
US9864909B2 (en) 2014-04-25 2018-01-09 Huntington Ingalls Incorporated System and method for using augmented reality display in surface treatment procedures
WO2015164755A1 (en) 2014-04-25 2015-10-29 Huntington Ingalls Incorporated Augmented reality display of dynamic target object information
CN103941953B (en) * 2014-04-28 2017-10-31 北京智谷睿拓技术服务有限公司 Information processing method and device
EP2939924A1 (en) * 2014-04-30 2015-11-04 Airbus Operations GmbH Digital crew assist
JPWO2015170555A1 (en) * 2014-05-09 2017-04-20 アルプス電気株式会社 Eyeglass-type electronic equipment
US9635257B2 (en) * 2014-05-12 2017-04-25 Gopro, Inc. Dual-microphone camera
CN103984413B (en) * 2014-05-19 2017-12-08 北京智谷睿拓技术服务有限公司 Information interaction method and information interaction device
US9710151B2 (en) 2014-05-21 2017-07-18 International Business Machines Corporation Evaluation of digital content using non-intentional user feedback obtained through haptic interface
US9600073B2 (en) 2014-05-21 2017-03-21 International Business Machines Corporation Automated adjustment of content composition rules based on evaluation of user feedback obtained through haptic interface
US9323331B2 (en) 2014-05-21 2016-04-26 International Business Machines Corporation Evaluation of digital content using intentional user feedback obtained through haptic interface
CN112987307B (en) 2014-05-30 2022-06-28 奇跃公司 Method and system for generating focal planes in virtual and augmented reality
CN111856755B (en) 2014-05-30 2022-07-19 奇跃公司 Method and system for displaying stereoscopic vision of virtual and augmented reality
CN103976715B (en) * 2014-06-09 2016-01-20 江苏启润科技有限公司 Multifunctional human health self-checking system
US10147234B2 (en) 2014-06-09 2018-12-04 Huntington Ingalls Incorporated System and method for augmented reality display of electrical system information
US10915754B2 (en) 2014-06-09 2021-02-09 Huntington Ingalls Incorporated System and method for use of augmented reality in outfitting a dynamic structural space
US10504294B2 (en) 2014-06-09 2019-12-10 Huntington Ingalls Incorporated System and method for augmented reality discrepancy determination and reporting
US10663740B2 (en) 2014-06-09 2020-05-26 Mentor Acquisition One, Llc Content presentation in head worn computing
EP3155560B1 (en) * 2014-06-14 2020-05-20 Magic Leap, Inc. Methods and systems for creating virtual and augmented reality
JP6292478B2 (en) * 2014-06-17 2018-03-14 コニカミノルタ株式会社 Information display system having transmissive HMD and display control program
EP2957983A1 (en) * 2014-06-18 2015-12-23 Alcatel Lucent User-wearable electronic device and system for personal computing
KR102209512B1 (en) * 2014-06-30 2021-01-29 엘지전자 주식회사 Glasses type mobile terminal
JP6905463B2 (en) * 2014-07-02 2021-07-21 アイディーエックス,エルエルシー A device and method for adjusting the alignment between the subject's eye and the optical axis of the eyepiece imaging device.
US20170143494A1 (en) * 2014-07-10 2017-05-25 Mohamed R. Mahfouz Bone Reconstruction and Orthopedic Implants
KR101629758B1 (en) * 2014-07-11 2016-06-24 넥시스 주식회사 Method and program for unlocking a wearable glasses device
WO2016006949A1 (en) * 2014-07-11 2016-01-14 넥시스 주식회사 System and method for processing data using wearable device
US9898867B2 (en) 2014-07-16 2018-02-20 Huntington Ingalls Incorporated System and method for augmented reality display of hoisting and rigging information
WO2016013692A1 (en) * 2014-07-22 2016-01-28 엘지전자(주) Head mounted display and control method thereof
EP2977855B1 (en) * 2014-07-23 2019-08-28 Wincor Nixdorf International GmbH Virtual keyboard and input method for a virtual keyboard
CN104090385B (en) * 2014-07-25 2015-11-18 金陵科技学院 Anti-cheating smart glasses
KR102433291B1 (en) * 2014-07-31 2022-08-17 삼성전자주식회사 Method and wearable glasses for providing a content
WO2016017997A1 (en) 2014-07-31 2016-02-04 Samsung Electronics Co., Ltd. Wearable glasses and method of providing content using the same
US9965030B2 (en) * 2014-07-31 2018-05-08 Samsung Electronics Co., Ltd. Wearable glasses and method of displaying image via the wearable glasses
US9959591B2 (en) * 2014-07-31 2018-05-01 Seiko Epson Corporation Display apparatus, method for controlling display apparatus, and program
EP3175293B1 (en) * 2014-07-31 2020-11-04 Vuzix Corporation Image and wave field projection through diffusive media
DE102014215372A1 (en) * 2014-08-05 2016-02-11 Conti Temic Microelectronic Gmbh Driver assistance system
EP3193202B1 (en) * 2014-09-11 2022-06-22 Huawei Technologies Co., Ltd. Mobile terminal
KR101728408B1 (en) * 2014-09-22 2017-04-19 (주)에프엑스기어 Apparatus and method for low latency simulation using estimation of orientation, and computer program for the same
JP6346537B2 (en) * 2014-09-29 2018-06-20 株式会社Nttドコモ Travel plan output system
WO2016060293A1 (en) * 2014-10-15 2016-04-21 엘지전자 주식회사 Image information display device and control method therefor
US9283138B1 (en) 2014-10-24 2016-03-15 Keith Rosenblum Communication techniques and devices for massage therapy
US10108256B2 (en) 2014-10-30 2018-10-23 Mediatek Inc. Systems and methods for processing incoming events while performing a virtual reality session
EP3015975A1 (en) * 2014-10-30 2016-05-04 Speech Processing Solutions GmbH Steering device for a dictation machine
WO2016070248A1 (en) * 2014-11-03 2016-05-12 Terrabuio Junior José Evangelista Immersive augmented virtual reality spectacles for use with smartphones, tablets, phablets and/or mobile CPUs with a screen
US10286308B2 (en) 2014-11-10 2019-05-14 Valve Corporation Controller visualization in virtual and augmented reality environments
US9462455B2 (en) * 2014-11-11 2016-10-04 Sony Corporation Dynamic user recommendations for ban enabled media experiences
US9366883B2 (en) 2014-11-13 2016-06-14 International Business Machines Corporation Using google glass to project a red overlay that enhances night vision
CN104394317A (en) * 2014-11-20 2015-03-04 段然 Method for processing recorded images of head-wearing recording equipment
CN104702911A (en) * 2014-11-24 2015-06-10 段然 Wearable video device real-time wireless transmission method
US9811954B2 (en) 2014-12-02 2017-11-07 Honeywell International, Inc. Near-to-eye display systems and methods for verifying aircraft components
US9684172B2 (en) 2014-12-03 2017-06-20 Osterhout Group, Inc. Head worn computer display systems
CN104483753A (en) * 2014-12-04 2015-04-01 上海交通大学 Auto-registration transmissive head-mounted display device
KR102362727B1 (en) 2014-12-18 2022-02-15 엘지이노텍 주식회사 Apparatus for measuring user's pulse, and computing apparatus using the apparatus
EP3234741A4 (en) * 2014-12-18 2018-08-22 Facebook, Inc. Method, system and device for navigating in a virtual reality environment
USD743963S1 (en) 2014-12-22 2015-11-24 Osterhout Group, Inc. Air mouse
US10108832B2 (en) * 2014-12-30 2018-10-23 Hand Held Products, Inc. Augmented reality vision barcode scanning system and method
USD751552S1 (en) 2014-12-31 2016-03-15 Osterhout Group, Inc. Computer glasses
USD753114S1 (en) 2015-01-05 2016-04-05 Osterhout Group, Inc. Air mouse
JP6451322B2 (en) * 2015-01-06 2019-01-16 セイコーエプソン株式会社 Image display device
CN107111139A (en) 2015-01-08 2017-08-29 阿什克伦眼镜科技有限公司 Apparatus and method for display content
KR102320737B1 (en) 2015-01-14 2021-11-03 삼성디스플레이 주식회사 Head mounted electronic device
EP4365665A3 (en) 2015-01-26 2024-07-24 Magic Leap, Inc. Virtual and augmented reality systems and methods having improved diffractive grating structures
KR101765838B1 (en) 2015-01-29 2017-08-10 유퍼스트(주) Wearable device for visual handicap person
US9830001B2 (en) * 2015-02-03 2017-11-28 Sony Mobile Communications Inc. Method, device and system for collecting writing pattern using ban
CN104576709B (en) * 2015-02-03 2017-07-04 京东方科技集团股份有限公司 OLED display substrate and preparation method thereof, wearable device
EP3054371A1 (en) * 2015-02-06 2016-08-10 Nokia Technologies OY Apparatus, method and computer program for displaying augmented information
CN105988562A (en) * 2015-02-06 2016-10-05 刘小洋 Intelligent wearable device and method for gesture input based on the same
US9632226B2 (en) 2015-02-12 2017-04-25 Digilens Inc. Waveguide grating device
KR102309451B1 (en) * 2015-02-13 2021-10-07 주식회사 엘지유플러스 Wearable Device and Control Method of Displaying on the Device Thereof
US20160239985A1 (en) 2015-02-17 2016-08-18 Osterhout Group, Inc. See-through computer display systems
CN107710009B (en) * 2015-02-27 2021-06-29 威尔乌集团 Controller visualization in virtual and augmented reality environments
JP2017009573A (en) * 2015-03-06 2017-01-12 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Wearing terminal and method of controlling wearing terminal
KR102630754B1 (en) * 2015-03-16 2024-01-26 매직 립, 인코포레이티드 Augmented Reality Pulse Oximetry
CN104657103B (en) * 2015-03-16 2017-06-16 哈尔滨工业大学 Hand-held CAVE optical projection systems based on depth camera
CN107209483A (en) * 2015-03-20 2017-09-26 华为技术有限公司 Intelligent interaction method, device and system
JP6683367B2 (en) * 2015-03-30 2020-04-22 国立大学法人東北大学 Biological information measuring device, biological information measuring method, and biological information measuring program
WO2016158624A1 (en) * 2015-03-30 2016-10-06 国立大学法人東北大学 Biological information measurement device, biological information measurement method, biological information display device and biological information display method
CN104731338B (en) * 2015-03-31 2017-11-14 深圳市虚拟现实科技有限公司 Closed-type augmented virtual reality system and method
JP2016192122A (en) 2015-03-31 2016-11-10 ソニー株式会社 Information processing apparatus, information processing method, and program
US9767171B2 (en) * 2015-04-03 2017-09-19 Oracle International Corporation Method and system for implementing an operating system hook in a log analytics system
CN104765456A (en) * 2015-04-08 2015-07-08 成都爱瑞斯文化传播有限责任公司 Virtual space system and building method thereof
JP6426525B2 (en) * 2015-04-20 2018-11-21 ファナック株式会社 Display system
WO2016170765A1 (en) * 2015-04-20 2016-10-27 日本電気株式会社 Target identification system, target identification method and program storage medium
KR102365492B1 (en) * 2015-04-22 2022-02-18 삼성전자주식회사 Wearable device
JP6646361B2 (en) * 2015-04-27 2020-02-14 ソニーセミコンダクタソリューションズ株式会社 Image processing apparatus, imaging apparatus, image processing method, and program
IL244255A (en) 2016-02-23 2017-04-30 Vertical Optics Llc Wearable vision redirecting devices
US9690119B2 (en) 2015-05-15 2017-06-27 Vertical Optics, LLC Wearable vision redirecting devices
IL295437B2 (en) * 2015-05-19 2024-11-01 Magic Leap Inc Dual integrated light field device
US10698535B2 (en) 2015-05-21 2020-06-30 Nec Corporation Interface control system, interface control apparatus, interface control method, and program
IL239191A0 (en) * 2015-06-03 2015-11-30 Amir B Geva Image classification system
CN104883543A (en) * 2015-06-04 2015-09-02 段然 Data acquisition system for uncompressed image transmission
CN104967887B (en) * 2015-06-06 2018-03-30 深圳市虚拟现实科技有限公司 NFC-based information interaction method and virtual reality glasses
KR102196507B1 (en) 2015-06-09 2020-12-30 한국전자통신연구원 Apparatus for visible light communication using electrically switchable glass and method using same
EP3923229A1 (en) * 2015-06-24 2021-12-15 Magic Leap, Inc. Augmented reality devices, systems and methods for purchasing
US9530426B1 (en) 2015-06-24 2016-12-27 Microsoft Technology Licensing, Llc Filtering sounds for conferencing applications
CN106326813B (en) * 2015-06-30 2023-04-07 深圳指芯智能科技有限公司 Intelligent variable-frequency 3D fingerprint sensor
JP7074478B2 (en) 2015-07-03 2022-05-24 エシロール アンテルナショナル Methods and systems for augmented reality
CN105093555B (en) * 2015-07-13 2018-08-14 深圳多新哆技术有限责任公司 Short-distance optical magnification module and near-eye display optical module using the same
CN105070204A (en) * 2015-07-24 2015-11-18 江苏天晟永创电子科技有限公司 Miniature AMOLED optical display
CN105022980B (en) * 2015-07-28 2017-11-14 福建新大陆电脑股份有限公司 Barcode image recognition apparatus
KR20170014028A (en) 2015-07-28 2017-02-08 현대자동차주식회사 Hands-free inspection apparatus and method for controlling the same
CN109409251B (en) 2015-08-18 2023-05-16 奇跃公司 Virtual and augmented reality systems and methods
KR102260483B1 (en) * 2015-08-25 2021-06-04 한국전자기술연구원 Smart glasses for displaying flight guide information
CN105100745B (en) * 2015-08-31 2018-03-23 国网浙江省电力公司湖州供电公司 A substation operation monitoring device
EP3138478B1 (en) * 2015-09-01 2023-11-01 Essilor International A heart rate sensing wearable device
JP2017049762A (en) 2015-09-01 2017-03-09 株式会社東芝 System and method
CN105091948A (en) * 2015-09-02 2015-11-25 徐艺斌 Multifunctional sensor module for myopia prevention frame
KR101716326B1 (en) * 2015-09-08 2017-03-14 클릭트 주식회사 Method and program for transmitting and playing virtual reality image
CN105259655A (en) * 2015-09-10 2016-01-20 上海理鑫光学科技有限公司 3D video system improving authenticity of virtual and actual superposition
EP3145168A1 (en) * 2015-09-17 2017-03-22 Thomson Licensing An apparatus and a method for generating data representing a pixel beam
CN105117111B (en) * 2015-09-23 2019-11-15 小米科技有限责任公司 Rendering method and device for virtual reality interactive images
CN114838733A (en) 2015-09-25 2022-08-02 苹果公司 Non-solid object monitoring
US10168769B2 (en) 2015-09-28 2019-01-01 Nec Corporation Input apparatus, input method, and program
CN108027654B (en) 2015-09-28 2021-01-12 日本电气株式会社 Input device, input method, and program
JP6598269B2 (en) * 2015-10-05 2019-10-30 ディジレンズ インコーポレイテッド Waveguide display
JP6662599B2 (en) * 2015-10-05 2020-03-11 ミツミ電機株式会社 Display device
JPWO2017064926A1 (en) * 2015-10-15 2018-08-02 ソニー株式会社 Information processing apparatus and information processing method
US10338677B2 (en) * 2015-10-28 2019-07-02 Microsoft Technology Licensing, Llc Adjusting image frames based on tracking motion of eyes
CN106814844A (en) * 2015-12-01 2017-06-09 深圳市掌网科技股份有限公司 Virtual reality interaction system and method
CN105976424A (en) * 2015-12-04 2016-09-28 乐视致新电子科技(天津)有限公司 Image rendering processing method and device
CN105455792B (en) * 2015-12-18 2018-09-25 济南中景电子科技有限公司 Headband for virtual reality glasses
CN105487229B (en) * 2015-12-18 2018-05-04 济南中景电子科技有限公司 Multi-modal interaction virtual reality glasses
CN108474959A (en) 2015-12-22 2018-08-31 E-视觉智能光学公司 Dynamic Focus Head Mounted Display
CN105608436B (en) * 2015-12-23 2021-10-22 联想(北京)有限公司 Power consumption control method and electronic equipment
WO2017117519A1 (en) * 2015-12-30 2017-07-06 Surefire Llc Optical narrowcasting
WO2017114755A1 (en) * 2015-12-31 2017-07-06 Thomson Licensing Configuration for rendering virtual reality with an adaptive focal plane
CN105455285B (en) * 2015-12-31 2019-02-12 北京小鸟看看科技有限公司 Virtual reality helmet adaptation method
KR102439768B1 (en) * 2016-01-07 2022-09-01 매직 립, 인코포레이티드 Virtual and augmented reality systems and methods with an unequal number of component color images distributed across depth planes
JP6952713B2 (en) * 2016-01-19 2021-10-20 マジック リープ, インコーポレイテッドMagic Leap,Inc. Augmented reality systems and methods that utilize reflection
KR102610120B1 (en) * 2016-01-20 2023-12-06 삼성전자주식회사 Head mounted display and control method thereof
CN105718167A (en) * 2016-01-21 2016-06-29 陈佩珊 Icon migration method and system based on smart glasses temple touch
CN105739851A (en) * 2016-01-21 2016-07-06 陈佩珊 Icon migration method and system based on voice recognition of smart glasses
US10908694B2 (en) * 2016-02-01 2021-02-02 Microsoft Technology Licensing, Llc Object motion tracking with remote device
CN105704501B (en) * 2016-02-06 2020-04-21 普宙飞行器科技(深圳)有限公司 Virtual reality live broadcast system based on unmanned aerial vehicle panoramic video
JP6341343B2 (en) 2016-02-08 2018-06-13 日本電気株式会社 Information processing system, information processing apparatus, control method, and program
US10372229B2 (en) 2016-02-25 2019-08-06 Nec Corporation Information processing system, information processing apparatus, control method, and program
CN205582205U (en) * 2016-03-02 2016-09-14 福州领头虎软件有限公司 Human situation and action monitoring alarm system
CN105720347A (en) * 2016-03-16 2016-06-29 昆山联滔电子有限公司 Watch chain-type window antenna
JP6493264B2 (en) * 2016-03-23 2019-04-03 横河電機株式会社 Maintenance information sharing apparatus, maintenance information sharing method, maintenance information sharing program, and recording medium
US10838502B2 (en) * 2016-03-29 2020-11-17 Microsoft Technology Licensing, Llc Sharing across environments
CN109073819A (en) 2016-04-07 2018-12-21 奇跃公司 System and method for augmented reality
US9910284B1 (en) 2016-09-08 2018-03-06 Osterhout Group, Inc. Optical systems for head-worn computers
WO2017176898A1 (en) 2016-04-08 2017-10-12 Magic Leap, Inc. Augmented reality systems and methods with variable focus lens elements
US10134198B2 (en) 2016-04-19 2018-11-20 Adobe Systems Incorporated Image compensation for an occluding direct-view augmented reality system
US11017257B2 (en) * 2016-04-26 2021-05-25 Sony Corporation Information processing device, information processing method, and program
CN105975060A (en) * 2016-04-26 2016-09-28 乐视控股(北京)有限公司 Virtual reality terminal as well as control method and apparatus therefor
CA2999057C (en) * 2016-04-27 2023-12-05 Rovi Guides, Inc. Methods and systems for displaying additional content on a heads up display displaying a virtual reality environment
IL289757B2 (en) 2016-05-09 2024-12-01 Magic Leap Inc Augmented reality systems and methods for analyzing user health
US11327475B2 (en) * 2016-05-09 2022-05-10 Strong Force Iot Portfolio 2016, Llc Methods and systems for intelligent collection and analysis of vehicle data
JP2019516153A (en) * 2016-05-12 2019-06-13 サーク・コーポレーション Controller Signs with Capacitive Sensing
CN107402378A (en) * 2016-05-19 2017-11-28 财团法人金属工业研究发展中心 Frequency modulation radar transceiver
US10482668B2 (en) * 2016-06-02 2019-11-19 Thales Visionix, Inc. Miniature vision-inertial navigation system with extended dynamic range
KR101859909B1 (en) 2016-06-07 2018-05-21 에스아이에스 주식회사 System and Method for Precasting and Tracking Red Tied Using Drone
JP6843530B2 (en) 2016-06-15 2021-03-17 任天堂株式会社 Game systems, methods, and game programs
CN106094203A (en) * 2016-06-16 2016-11-09 捷开通讯(深圳)有限公司 VR system, wearable device for controlling VR equipment, and method thereof
EP3472828B1 (en) 2016-06-20 2022-08-10 Magic Leap, Inc. Augmented reality display system for evaluation and modification of neurological conditions, including visual processing and perception conditions
KR101817952B1 (en) * 2016-06-23 2018-01-12 주식회사 맥스트 See-through type head mounted display apparatus and method of controlling display depth thereof
CN107544661B (en) * 2016-06-24 2020-06-23 联想(北京)有限公司 Information processing method and electronic equipment
US10754161B2 (en) * 2016-07-12 2020-08-25 Mitsubishi Electric Corporation Apparatus control system
CN106200972A (en) * 2016-07-14 2016-12-07 乐视控股(北京)有限公司 Method and device for adjusting virtual reality scene parameters
CN112859563B (en) * 2016-07-24 2022-07-05 光场实验室公司 Calibration method for holographic energy-guided systems
CN106267552B (en) 2016-07-25 2020-03-10 京东方科技集团股份有限公司 Wearable device, virtual reality method and terminal system
CN109564428B (en) 2016-07-29 2022-05-31 日本电气方案创新株式会社 Moving object operating system, operation signal transmitting system, moving object operating method, program, and recording medium
JP6744033B2 (en) 2016-07-29 2020-08-19 Necソリューションイノベータ株式会社 Mobile control system, control signal transmission system, mobile control method, program, and recording medium
CN106162206A (en) * 2016-08-03 2016-11-23 北京疯景科技有限公司 Panorama recording and playback method and device
CH712799A1 (en) * 2016-08-10 2018-02-15 Derungs Louis Virtual reality method and system implementing such method.
US10402649B2 (en) * 2016-08-22 2019-09-03 Magic Leap, Inc. Augmented reality display device with deep learning sensors
US10557943B2 (en) * 2016-08-22 2020-02-11 Apple Inc. Optical systems
CN106239513A (en) * 2016-08-29 2016-12-21 合肥凌翔信息科技有限公司 Remote-controlled robot system
KR102746273B1 (en) * 2016-09-08 2024-12-26 엘지전자 주식회사 Head mounted display and method for controlling the same
WO2018047433A1 (en) * 2016-09-08 2018-03-15 ソニー株式会社 Information processing device
CN106408303A (en) * 2016-09-21 2017-02-15 上海星寰投资有限公司 Payment method and system
CN106251153A (en) * 2016-09-21 2016-12-21 上海星寰投资有限公司 Payment method and system
CN106203410B (en) * 2016-09-21 2023-10-17 上海星寰投资有限公司 Identity verification method and system
WO2018053819A1 (en) 2016-09-24 2018-03-29 华为技术有限公司 Offline management method for application use time, and terminal device
CN107885311A (en) * 2016-09-29 2018-04-06 深圳纬目信息技术有限公司 Visual interaction confirmation method, system and device
KR102743204B1 (en) * 2016-10-17 2024-12-17 엘지전자 주식회사 Head mounted display device
KR102662708B1 (en) * 2016-10-17 2024-05-03 엘지전자 주식회사 Head mounted display device
WO2018075968A1 (en) * 2016-10-21 2018-04-26 Magic Leap, Inc. System and method for presenting image content on multiple depth planes by providing multiple intra-pupil parallax views
CN106507121A (en) * 2016-10-31 2017-03-15 易瓦特科技股份公司 Live broadcast control method, VR device and unmanned aerial vehicle
CN106651355A (en) * 2016-11-08 2017-05-10 北京小米移动软件有限公司 Payment method and device, and virtual reality helmet
KR102650572B1 (en) * 2016-11-16 2024-03-26 삼성전자주식회사 Electronic apparatus and method for controlling thereof
EP3470976A1 (en) 2017-10-12 2019-04-17 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method and apparatus for efficient delivery and usage of audio messages for high quality of experience
US20180144554A1 (en) 2016-11-18 2018-05-24 Eyedaptic, LLC Systems for augmented reality visual aids and tools
US10466774B2 (en) * 2016-11-22 2019-11-05 Honeywell International Inc. NTE display systems and methods with optical trackers
EP3548939A4 (en) 2016-12-02 2020-11-25 DigiLens Inc. WAVE GUIDE DEVICE WITH UNIFORM OUTPUT LIGHTING
US10055028B2 (en) 2016-12-05 2018-08-21 Google Llc End of session detection in an augmented and/or virtual reality environment
US9906290B2 (en) * 2016-12-06 2018-02-27 Mediatek Singapore Pte. Ltd. Method for network merging and configuration sharing and associated apparatus
KR20180065515A (en) 2016-12-08 2018-06-18 박순구 Multifunctional wearable display apparatus
CN108205416B (en) * 2016-12-20 2021-06-18 法法汽车(中国)有限公司 Method for activating a terminal screen using an in-vehicle head unit, in-vehicle head unit, and intelligent vehicle
CN106603107B (en) * 2016-12-21 2019-10-29 Tcl移动通信科技(宁波)有限公司 Helmet and control method thereof
JP6382928B2 (en) * 2016-12-27 2018-08-29 株式会社コロプラ Method executed by computer to control display of image in virtual space, program for causing computer to realize the method, and computer apparatus
JP6255470B1 (en) * 2016-12-27 2017-12-27 株式会社Qdレーザ Retina scanning optometry apparatus, retinal scanning optometry system, retinal scanning optometry method, retinal scanning eyewear providing system, retinal scanning eyewear providing method, and retinal scanning eyewear
US10455165B2 (en) 2016-12-28 2019-10-22 Microsoft Technology Licensing, Llc Systems, methods, and computer-readable media for using a video capture device to alleviate motion sickness via an augmented display for a passenger
US10147460B2 (en) * 2016-12-28 2018-12-04 Immersion Corporation Haptic effect generation for space-dependent content
CN106790579B (en) * 2016-12-28 2020-07-03 深圳市明瞳视光科技有限公司 Active information transmission method and system based on intelligent glasses
US10088911B2 (en) 2016-12-30 2018-10-02 Manuel Saez Programmable electronic helmet
WO2018122859A1 (en) * 2016-12-31 2018-07-05 Lumus Ltd. Eye tracker based on retinal imaging via light-guide optical element
CN106681955B (en) * 2017-01-04 2023-05-09 四川埃姆克伺服科技有限公司 Universal interface circuit for receiving signals from a servo motor position sensor
US10545346B2 (en) 2017-01-05 2020-01-28 Digilens Inc. Wearable heads up displays
CN106864362A (en) * 2017-01-18 2017-06-20 陈宗坤 Warning sign with air purification function
CN106846383B (en) * 2017-01-23 2020-04-17 宁波诺丁汉大学 High dynamic range image imaging method based on 3D digital microscopic imaging system
WO2018139020A1 (en) * 2017-01-24 2018-08-02 ソニー株式会社 Hinge mechanism and head-mounted display comprising said hinge mechanism
US20180212314A1 (en) * 2017-01-24 2018-07-26 Intel Corporation Wearable device sar reduction and antenna improvement
US11125520B2 (en) 2017-01-27 2021-09-21 Armaments Research Company, Inc. Firearm usage monitoring system providing alerts for ammunition resupply
US11215416B2 (en) 2017-01-27 2022-01-04 Armaments Research Company, Inc. Weapon monitoring system with a map-based dashboard interface
US11561058B2 (en) 2017-01-27 2023-01-24 Armaments Research Company Inc. Weapon usage monitoring system with situational state analytics
US11395628B2 (en) * 2017-02-16 2022-07-26 Samsung Electronics Co., Ltd. Method of providing service based on biometric information and wearable electronic device
EP4621459A3 (en) 2017-02-22 2025-12-31 Lumus Ltd. OPTICAL FIBER GUIDE ARRANGEMENT
KR102574219B1 (en) 2017-02-23 2023-09-01 매직 립, 인코포레이티드 Variable-focus virtual image devices based on polarization conversion
CN106597673B (en) * 2017-02-28 2020-04-03 京东方科技集团股份有限公司 Virtual reality display device and its driving method and driving module
CN106932906A (en) * 2017-03-04 2017-07-07 国家电网公司 Mixed reality display device
CN112270260A (en) * 2017-03-06 2021-01-26 苏州佳世达光电有限公司 Identification method and electronic equipment
WO2018164914A2 (en) * 2017-03-07 2018-09-13 Apple Inc. Head-mounted display system
EP3376279B1 (en) * 2017-03-13 2022-08-31 Essilor International Optical device for a head-mounted display, and head-mounted device incorporating it for augmented reality
WO2018167843A1 (en) 2017-03-14 2018-09-20 日本電気株式会社 Information processing device, information processing system, control method, and program
KR20240069826A (en) * 2017-03-21 2024-05-20 매직 립, 인코포레이티드 Low-profile beam splitter
KR102579249B1 (en) 2017-03-21 2023-09-15 매직 립, 인코포레이티드 Methods, devices, and systems for illuminating spatial light modulators
CN113341566B (en) 2017-03-22 2023-12-15 鲁姆斯有限公司 Overlapping reflective surface constructions
CN106842576A (en) * 2017-03-23 2017-06-13 核桃智能科技(常州)有限公司 Wearable intelligent display device with mobile communication function
JP2018170656A (en) * 2017-03-30 2018-11-01 ソニーセミコンダクタソリューションズ株式会社 Image capturing device, image capturing module, image capturing system, and control method of image capturing device
CN107015655A (en) * 2017-04-11 2017-08-04 苏州和云观博数字科技有限公司 AR experience glasses device for museum virtual scenes and implementation method thereof
CN106897576B (en) * 2017-04-17 2023-10-31 安徽咏鹅家纺股份有限公司 Intelligent sleep monitoring and sleep-aiding cloud service system
CN106871973A (en) * 2017-04-21 2017-06-20 佛山市川东磁电股份有限公司 Temperature and humidity sensor
JP7141410B2 (en) 2017-05-01 2022-09-22 マジック リープ, インコーポレイテッド Matching Content to Spatial 3D Environments
KR20180123354A (en) * 2017-05-08 2018-11-16 엘지전자 주식회사 User interface apparatus for vehicle and Vehicle
US10795178B2 (en) * 2017-05-09 2020-10-06 Amtran Technology Co., Ltd. Device for mixed reality
CN107071285A (en) * 2017-05-16 2017-08-18 广东交通职业技术学院 Follow-shooting method, memory, and unmanned aerial vehicle with follow-shooting device
CN107221066A (en) * 2017-05-16 2017-09-29 嘉兴市天篷农业休闲有限公司 AR system for enhancing the tourist experience
CN108955396A (en) * 2017-05-17 2018-12-07 广东建元和安科技发展有限公司 Hand-held anti-sniper active detection device
GB2552872B (en) * 2017-05-17 2018-08-29 Vision Rt Ltd Patient monitoring system
CN108958461A (en) * 2017-05-24 2018-12-07 宏碁股份有限公司 Virtual reality system with adaptive control and control method thereof
JP6947661B2 (en) * 2017-05-26 2021-10-13 株式会社コロプラ A program executed by a computer capable of communicating with the head mount device, an information processing device for executing the program, and a method executed by a computer capable of communicating with the head mount device.
CN107193381A (en) * 2017-05-31 2017-09-22 湖南工业大学 Smart glasses based on eye-tracking sensing technology and display method thereof
JP6613267B2 (en) 2017-06-02 2019-11-27 任天堂株式会社 Information processing system, information processing program, information processing apparatus, and information processing method
JP6837921B2 (en) 2017-06-02 2021-03-03 任天堂株式会社 Game programs, information processing devices, information processing systems, and information processing methods
JP6653293B2 (en) 2017-06-05 2020-02-26 任天堂株式会社 Information processing system, information processing program, information processing apparatus, and information processing method
KR102482756B1 (en) * 2017-06-14 2022-12-30 삼성전자주식회사 Head-mounted display apparatus
CN107679380B (en) * 2017-06-22 2020-08-11 国网浙江平湖市供电公司 An intelligent inspection device and method based on identity recognition
CN109116559B (en) 2017-06-26 2024-06-11 京东方科技集团股份有限公司 Display system and image display method
KR102347128B1 (en) * 2017-06-29 2022-01-05 한국전자기술연구원 High visibility microdisplay device and HMD comprising the same
US10838499B2 (en) * 2017-06-29 2020-11-17 Apple Inc. Finger-mounted device with sensors and haptics
CN109215132A (en) * 2017-06-30 2019-01-15 华为技术有限公司 Implementation method and device for augmented reality services
EP4328656B1 (en) * 2017-07-06 2025-12-17 Magic Leap, Inc. Speckle-reduction in virtual and augmented reality systems and methods
US20190012841A1 (en) 2017-07-09 2019-01-10 Eyedaptic, Inc. Artificial intelligence enhanced system for adaptive control driven ar/vr visual aids
US10578869B2 (en) 2017-07-24 2020-03-03 Mentor Acquisition One, Llc See-through computer display systems with adjustable zoom cameras
US10422995B2 (en) 2017-07-24 2019-09-24 Mentor Acquisition One, Llc See-through computer display systems with stray light management
US11409105B2 (en) 2017-07-24 2022-08-09 Mentor Acquisition One, Llc See-through computer display systems
KR102026526B1 (en) * 2017-08-03 2019-09-30 주식회사 에스지엠 Authentication system using bio-information and screen golf system using the same
CN107422480A (en) * 2017-08-03 2017-12-01 深圳市汇龙天成科技有限公司 Semi-transparent semi-reflective toroidal lens display structure and display method
US10969584B2 (en) 2017-08-04 2021-04-06 Mentor Acquisition One, Llc Image expansion optic for head-worn computer
KR102485447B1 (en) 2017-08-09 2023-01-05 삼성전자주식회사 Optical window system and see-through type display apparatus including the same
WO2019035600A1 (en) * 2017-08-15 2019-02-21 Samsung Electronics Co., Ltd. System and method for displaying real or virtual scene
JP6958106B2 (en) * 2017-08-21 2021-11-02 セイコーエプソン株式会社 Manufacturing method of deflector, display device and deflector
US11607600B2 (en) 2017-08-24 2023-03-21 Vuzix Corporation Swim AR goggles
CN107609492B (en) * 2017-08-25 2019-06-21 西安电子科技大学 Distorted image quality perceptual evaluation method based on EEG signal
CN115291720A (en) 2017-08-29 2022-11-04 苹果公司 Electronic device with adaptive display
JP6458106B1 (en) * 2017-09-13 2019-01-23 株式会社コロプラ Computer-implemented method for providing content in a means of transportation, program for causing a computer to execute the method, content providing apparatus, and content providing system
JP6987737B2 (en) * 2017-09-13 2022-01-05 株式会社コロプラ Computer-implemented method for providing content in a means of transportation, program for causing a computer to execute the method, content providing apparatus, and content providing system
JP7020490B2 (en) 2017-09-20 2022-02-16 日本電気株式会社 Information processing equipment, control methods, and programs
CN107657791A (en) * 2017-09-29 2018-02-02 歌尔股份有限公司 VR/AR helmet
CA3077455A1 (en) 2017-10-11 2019-04-18 Magic Leap, Inc. Augmented reality display comprising eyepiece having a transparent emissive display
FR3072468B1 (en) * 2017-10-13 2020-02-14 Alessandro Manneschi DEVICE AND METHOD FOR DETECTING UNAUTHORIZED OBJECTS OR MATERIALS CARRIED BY AN INDIVIDUAL IN A PROTECTED ACCESS AREA
US11086315B2 (en) 2017-10-26 2021-08-10 2KR Systems, LLC Building rooftop intelligence gathering, decision-support and snow load removal system for protecting buildings from excessive snow load conditions, and automated methods for carrying out the same
US10984508B2 (en) 2017-10-31 2021-04-20 Eyedaptic, Inc. Demonstration devices and methods for enhancement for low vision users and systems improvements
KR102063780B1 (en) * 2017-11-21 2020-01-08 고려대학교산학협력단 Virtual reality device for preventing myopic progression
IL255955B (en) 2017-11-27 2019-06-30 Elbit Systems Ltd System and method for providing synthetic information on a see-through device
KR102028997B1 (en) * 2017-11-29 2019-10-07 엘지디스플레이 주식회사 Head mount display device
TWI660630B (en) 2017-12-06 2019-05-21 瑞昱半導體股份有限公司 Method and system for detecting video scan type
CN109918975B (en) * 2017-12-13 2022-10-21 腾讯科技(深圳)有限公司 An augmented reality processing method, object recognition method and terminal
EP3729243A4 (en) * 2017-12-19 2021-09-15 Datalogic IP Tech S.r.l. User-wearable systems and methods to collect data and provide information
JP7171727B2 (en) 2017-12-20 2022-11-15 ビュージックス コーポレーション Augmented reality display system
CN108169901A (en) * 2017-12-27 2018-06-15 北京传嘉科技有限公司 VR glasses
US10360454B1 (en) 2017-12-28 2019-07-23 Rovi Guides, Inc. Systems and methods for presenting supplemental content in augmented reality
CN111433657A (en) * 2017-12-28 2020-07-17 深圳市柔宇科技有限公司 Diopter adjusting device and electronic equipment
US20190212699A1 (en) 2018-01-08 2019-07-11 Digilens, Inc. Methods for Fabricating Optical Waveguides
KR20190085368A (en) 2018-01-10 2019-07-18 삼성전자주식회사 Folding-type wearable electronic device with optical transfering member transfer to transparent member from projector
CN108122248B (en) * 2018-01-15 2020-04-24 武汉大学 Dam natural vibration frequency identification method based on video measurement
CN108459812B (en) * 2018-01-22 2021-03-02 郑州升达经贸管理学院 A system and method for art trajectory display and pursuit
US20210055560A1 (en) * 2018-01-26 2021-02-25 Tesseland Llc Compact optics in crossed configuration for virtual and mixed reality
US10956086B2 (en) * 2018-01-29 2021-03-23 Micron Technology, Inc. Memory controller
US11567627B2 (en) * 2018-01-30 2023-01-31 Magic Leap, Inc. Eclipse cursor for virtual content in mixed reality displays
CN108798360A (en) * 2018-02-01 2018-11-13 李绍辉 Rapid smoke dispersal method based on communication technology
CN108427830A (en) * 2018-02-09 2018-08-21 中建五局第三建设有限公司 Method and device for guiding spatial lofting of structure by using mixed reality technology
WO2019156839A1 (en) 2018-02-09 2019-08-15 Vuzix Corporation Image light guide with circular polarizer
US11093208B2 (en) 2018-02-16 2021-08-17 Valve Corporation Using detected pupil location to align optical components of a head-mounted display
WO2019165055A1 (en) 2018-02-22 2019-08-29 Magic Leap, Inc. Browser for mixed reality systems
US10735649B2 (en) 2018-02-22 2020-08-04 Magic Leap, Inc. Virtual and augmented reality systems and methods using display system control information embedded in image data
CN111801641B (en) 2018-02-22 2025-04-04 奇跃公司 Object creation system and method using physical manipulation
KR102546994B1 (en) * 2018-02-26 2023-06-22 엘지전자 주식회사 Wearable glass device
CN108479056B (en) * 2018-03-05 2021-12-31 江苏嘉尚环保科技有限公司 Online doll grabbing machine for blind people
US11563885B2 (en) 2018-03-06 2023-01-24 Eyedaptic, Inc. Adaptive system for autonomous machine learning and control in wearable augmented reality and virtual reality visual aids
JP7173126B2 (en) * 2018-03-14 2022-11-16 ソニーグループ株式会社 Information processing device, information processing method, and recording medium
US10695667B2 (en) * 2018-03-14 2020-06-30 Sony Interactive Entertainment LLC Pro gaming AR visor and method for parsing context specific HUD content from a video stream
KR102395445B1 (en) * 2018-03-26 2022-05-11 한국전자통신연구원 Electronic device for estimating position of sound source
CN108337573A (en) * 2018-03-26 2018-07-27 京东方科技集团股份有限公司 Implementation method and medium for real-time race commentary
WO2019185986A2 (en) * 2018-03-28 2019-10-03 Nokia Technologies Oy A method, an apparatus and a computer program product for virtual reality
CN108398791B (en) * 2018-03-29 2022-11-25 陈超平 Near-to-eye display device based on polarized contact lenses
AT521130A1 (en) * 2018-04-04 2019-10-15 Peterseil Thomas Method for displaying a virtual object
JP6368881B1 (en) * 2018-04-09 2018-08-01 チームラボ株式会社 Display control system, terminal device, computer program, and display control method
KR102063395B1 (en) 2018-04-10 2020-01-07 (주)세이프인 Virtual fire training simulator
CN108363522B (en) * 2018-04-24 2024-08-13 石家庄科达文教用品有限公司 Synchronous writing system and method thereof
CN112601509B (en) * 2018-05-29 2024-01-23 爱达扩视眼镜公司 Hybrid see-through augmented reality system and method for low vision users
WO2019236096A1 (en) 2018-06-08 2019-12-12 Hewlett-Packard Development Company, L.P. Computing input devices with sensors concealed in articles of clothing
CN108803877A (en) * 2018-06-11 2018-11-13 联想(北京)有限公司 Switching method, device and electronic equipment
FR3081639B1 (en) * 2018-06-11 2020-07-31 Orange OPTICAL DATA TRANSMISSION METHOD AND SYSTEM FOR VIRTUAL OR AUGMENTED REALITY APPLICATIONS
CN108982062B (en) * 2018-06-14 2020-04-21 上海卫星工程研究所 Visual field alignment method for linear array imaging optical load in satellite stray light test
CN108983636B (en) * 2018-06-20 2020-07-17 浙江大学 Human-machine intelligent symbiosis platform system
JP7175664B2 (en) * 2018-07-11 2022-11-21 克行 廣中 Voice conversation radio with light emitting function
TWI841254B (en) * 2018-07-12 2024-05-01 揚明光學股份有限公司 Optical device
TWI797142B (en) * 2018-07-12 2023-04-01 揚明光學股份有限公司 Optical device and fabrication method thereof
US10834986B2 (en) * 2018-07-12 2020-11-17 Sarah Nicole Ciccaglione Smart safety helmet with heads-up display
CN110794644B (en) * 2018-08-03 2023-02-24 扬明光学股份有限公司 Optical device and manufacturing method thereof
TWI830772B (en) * 2018-08-26 2024-02-01 以色列商魯姆斯有限公司 Near-eye displays
DE102018121258A1 (en) * 2018-08-30 2020-03-05 Vr Coaster Gmbh & Co. Kg Head-mounted display and amusement facility with such a head-mounted display
US11174022B2 (en) * 2018-09-17 2021-11-16 International Business Machines Corporation Smart device for personalized temperature control
EP3853691A1 (en) * 2018-09-18 2021-07-28 Transrobotics, Inc. Technologies for acting based on object detection
WO2020068721A1 (en) 2018-09-24 2020-04-02 Ecolab Usa Inc. Methods and compositions for pre-extractive beneficiation of ores
US12298519B2 (en) 2018-09-24 2025-05-13 Apple Inc. Display system with interchangeable lens
KR20210058964A (en) 2018-09-24 2021-05-24 아이답틱 인코포레이티드 Improved autonomous hands-free control in electronic visual aids
EP3864363B1 (en) 2018-10-12 2024-07-10 Armaments Research Company Inc. Firearm monitoring and remote support system
FR3087284B1 (en) * 2018-10-15 2021-11-05 Amadeus Sas AUGMENTED REALITY PROCESS AND SYSTEM
JP7044034B2 (en) * 2018-11-06 2022-03-30 トヨタ自動車株式会社 Information processing equipment, information processing methods and programs
CN109559541B (en) * 2018-11-20 2021-06-22 华东交通大学 Unmanned vehicle route management system
KR101942770B1 (en) 2018-11-29 2019-01-28 네이버시스템(주) Image processing system to synthesis revising photo image with location information
TWI687953B (en) * 2018-12-05 2020-03-11 宏碁股份有限公司 Key structure and mode switching method thereof
EP3663904A1 (en) * 2018-12-07 2020-06-10 Iristick nv Portable mobile mode for headset
GB201820117D0 (en) * 2018-12-11 2019-01-23 Rolls Royce Plc Inspection system
CN111310530B (en) * 2018-12-12 2023-06-30 百度在线网络技术(北京)有限公司 Sign language and voice conversion method and device, storage medium and terminal equipment
CN113196377B (en) * 2018-12-20 2024-04-12 Ns西日本株式会社 Display light emitting device, head-up display device, image display system, and helmet
CN109407325A (en) * 2018-12-21 2019-03-01 周桂兵 Multipurpose VR smart glasses and display method thereof
CN109808711B (en) * 2018-12-25 2020-07-07 南京师范大学 Automatic driving vehicle control method and system, automatic driving vehicle and visual prosthesis
TWI740083B (en) * 2018-12-27 2021-09-21 雅得近顯股份有限公司 Low-light environment display structure
CN109696747B (en) * 2019-01-16 2022-04-12 京东方科技集团股份有限公司 VR display device and control method thereof
DE102019202512A1 (en) * 2019-01-30 2020-07-30 Siemens Aktiengesellschaft Method and arrangement for outputting a HUD on an HMD
CN109886170B (en) * 2019-02-01 2019-12-17 河海大学 Intelligent oncomelania detection, identification and statistics system
KR102185519B1 (en) * 2019-02-13 2020-12-02 주식회사 싸이큐어 Method of garbling real-world image for direct encoding type see-through head mount display and direct encoding type see-through head mount display with real-world image garbling function
KR20200099047A (en) * 2019-02-13 2020-08-21 주식회사 싸이큐어 Method of garbling real-world image for see-through head mount display and see-through head mount display with real-world image garbling function
CN115202037B (en) * 2019-03-06 2024-12-10 株式会社理光 Optical device, retinal projection display device, head mounted display device
TWI711005B (en) * 2019-03-14 2020-11-21 宏碁股份有限公司 Method for adjusting luminance of images and computer program product
US11668945B2 (en) 2019-03-27 2023-06-06 Panasonic Intellectual Property Management Co., Ltd. Head-mounted display
EP3948747A4 (en) 2019-04-03 2022-07-20 Magic Leap, Inc. Managing and displaying webpages in a virtual three-dimensional space with a mixed reality system
CN110197601A (en) * 2019-04-24 2019-09-03 薄涛 Mixed reality glasses, mobile terminal and tutoring system, method and medium
US11758702B2 (en) 2019-04-30 2023-09-12 Apple Inc. Noise mitigation for head-mounted device
WO2020225747A1 (en) 2019-05-06 2020-11-12 Lumus Ltd. Transparent lightguide for viewing a scene and a near-eye display
CN110110458B (en) * 2019-05-14 2023-03-14 西安电子科技大学 Deformation conformal array antenna modeling method based on high-order moment method
CN110197142A (en) * 2019-05-16 2019-09-03 谷东科技有限公司 Object recognition method, device, medium and terminal device under low-light conditions
JP6641055B2 (en) * 2019-05-29 2020-02-05 株式会社東芝 Wearable terminal, system and display method
CN110175065A (en) * 2019-05-29 2019-08-27 广州视源电子科技股份有限公司 User interface display method, device, equipment and storage medium
CN110210390B (en) * 2019-05-31 2021-08-31 维沃移动通信有限公司 Fingerprint collection module, fingerprint collection method and terminal
TWI870411B (en) * 2019-06-04 2025-01-21 以色列商魯姆斯有限公司 Binocular type head mounted display system with adjustable interpupillary distance mechanism
WO2020251565A1 (en) * 2019-06-12 2020-12-17 Hewlett-Packard Development Company, L.P. Finger clip biometric virtual reality controllers
CN110276578A (en) * 2019-06-14 2019-09-24 武汉合创源科技有限公司 Merchandise warehouse safety monitoring system and method
KR102723835B1 (en) 2019-06-21 2024-10-31 애플 인크. Display and vision correction system with removable lenses
TWI870420B (en) * 2019-06-23 2025-01-21 以色列商魯姆斯有限公司 Display with foveated optical correction and method for displaying an image to an eye of a user
CN110363205B (en) * 2019-06-25 2021-06-22 浙江大学 An image feature extraction system and method based on Talbot effect optical convolution
TWI807066B (en) * 2019-07-08 2023-07-01 怡利電子工業股份有限公司 Glasses-free 3D reflective diffuser head-up display device
WO2021024524A1 (en) * 2019-08-06 2021-02-11 パナソニックIpマネジメント株式会社 Display device
CN110361707B (en) * 2019-08-09 2023-03-14 成都玖锦科技有限公司 Dynamic simulation method for motion state of radiation source
US11419516B2 (en) * 2019-08-26 2022-08-23 GE Precision Healthcare LLC MRI system comprising patient motion sensor
US11307416B2 (en) * 2019-08-28 2022-04-19 Lg Electronics Inc. Wearable electronic device on head
CN114450608A (en) 2019-08-29 2022-05-06 迪吉伦斯公司 Vacuum Bragg grating and method of manufacture
JP2022549408A (en) * 2019-09-05 2022-11-25 オープン レンズ プロジェクト リミテッド Systems and methods for managing digital media content
CN112462932B (en) * 2019-09-06 2025-01-10 苹果公司 Self-mixing interferometry-based gesture input system for wearable or handheld devices
US11617504B2 (en) 2019-09-18 2023-04-04 Verily Life Sciences Llc Retinal camera with dynamic illuminator for expanding eyebox
US12153505B2 (en) * 2019-10-01 2024-11-26 Weiland Innovations Llc Automated system for generating properly tagged training data for and verifying the efficacy of artificial intelligence algorithms
KR102824477B1 (en) * 2019-11-04 2025-06-24 엘지전자 주식회사 Multimedia device and method for controlling the same
CN114730068B (en) * 2019-11-13 2025-05-27 奇跃公司 Ambient light management system and method for wearable devices
KR102401854B1 (en) * 2019-11-29 2022-06-08 주식회사 카이비전 Augmented reality glass for multi-function
CN111160105A (en) * 2019-12-03 2020-05-15 北京文香信息技术有限公司 Video image monitoring method, device, equipment and storage medium
JP7170277B2 (en) * 2019-12-09 2022-11-14 株式会社辰巳菱機 Reporting device
CN111048215B (en) * 2019-12-13 2023-08-18 北京纵横无双科技有限公司 Medical video production method and system based on CRM
CN110865461A (en) * 2019-12-20 2020-03-06 西安睿雅赫工业科技合伙企业(普通合伙) Smart glasses with internal visual display
CN110908122A (en) * 2019-12-20 2020-03-24 西安睿雅赫工业科技合伙企业(普通合伙) Split-type smart glasses
CN111179301B (en) * 2019-12-23 2023-06-30 北京中广上洋科技股份有限公司 Motion trend analysis method based on computer video
US12256057B2 (en) 2019-12-31 2025-03-18 ResMed Asia Pte. Ltd. Positioning, stabilising, and interfacing structures and system incorporating same
CN112327313B (en) * 2020-01-14 2024-03-29 必虎嘉骁光电技术(重庆)有限公司 Binocular rangefinder
US11157086B2 (en) * 2020-01-28 2021-10-26 Pison Technology, Inc. Determining a geographical location based on human gestures
IT202000001786A1 (en) * 2020-01-30 2021-07-30 Ncc Italy Soc Cooperativa ASSISTANCE AND MONITORING KIT FOR THE SERVICE OFFERED BY THE DRIVER OF A VEHICLE
CN115039016A (en) * 2020-01-31 2022-09-09 微芯片技术股份有限公司 Head-up display using electrochromic element
CN111317257B (en) * 2020-03-25 2022-05-24 黑龙江工业学院 Multimedia lectern for special-needs children's education
CN113874777B (en) * 2020-03-27 2023-01-10 瑞思迈私人有限公司 Head-mounted display system
US11686948B2 (en) 2020-03-27 2023-06-27 ResMed Pty Ltd Positioning, stabilising, and interfacing structures and system incorporating same
US11598967B2 (en) 2020-03-27 2023-03-07 ResMed Pty Ltd Positioning and stabilising structure and system incorporating same
US12178276B2 (en) 2020-03-27 2024-12-31 ResMed Pty Ltd Positioning and stabilising structure and system incorporating same
JP2021163287A (en) * 2020-03-31 2021-10-11 エイベックス・テクノロジーズ株式会社 Augmenting reality system
JP2021163499A (en) * 2020-03-31 2021-10-11 エイベックス・テクノロジーズ株式会社 Augmented reality system
CN114895464B (en) 2020-03-31 2025-05-09 优奈柯恩(北京)科技有限公司 Display Devices
CN111426283B (en) * 2020-04-14 2022-12-06 昆山金智汇坤建筑科技有限公司 Laser scanning equipment for building site measurement
EP4139716A1 (en) * 2020-04-21 2023-03-01 INOVA Ltd. Motion aware nodal seismic unit and related methods
US11915276B2 (en) * 2020-04-28 2024-02-27 Cisco Technology, Inc. System, method, and computer readable storage media for millimeter wave radar detection of physical actions coupled with an access point off-load control center
KR20230004553A (en) 2020-04-30 2023-01-06 루머스 리미티드 Optical sample characterization
US20230194881A1 (en) * 2020-05-29 2023-06-22 Vrmedia S.R.L. System for augmented reality
KR102498191B1 (en) * 2020-06-02 2023-02-10 주식회사 피앤씨솔루션 Optical system for augmented reality with a reflective surface and a head mounted display apparatus using thereof
CN111708170A (en) * 2020-07-10 2020-09-25 温州明镜智能科技有限公司 Novel integrated lens structure for VR glasses
US11513360B2 (en) 2020-07-17 2022-11-29 Toyota Research Institute, Inc. Enhanced contrast augmented reality (AR) tags for visual fiducial system
JP7442140B2 (en) * 2020-09-10 2024-03-04 公益財団法人鉄道総合技術研究所 Computer system and control method
JP2023542645A (en) * 2020-09-11 2023-10-11 フルークコーポレイション System and method for acoustic imaging using cumulative time views
CN112565720A (en) * 2020-09-17 2021-03-26 苏州恒创文化传播有限公司 3D projection system based on holographic technology
KR20230079411A (en) 2020-09-30 2023-06-07 스냅 인코포레이티드 Multipurpose cameras for augmented reality and computer vision applications
US12267585B2 (en) 2020-09-30 2025-04-01 Snap Inc. Ultra low power camera pipeline for CV in AR systems
JP7562061B2 (en) * 2020-11-22 2024-10-07 斉 永岡 Projection function, smart glasses with display, and output terminal
CN112370240A (en) * 2020-12-01 2021-02-19 創啟社會科技有限公司 Auxiliary intelligent glasses and system for vision impairment and control method thereof
IT202000028787A1 (en) * 2020-12-01 2022-06-01 Virtual Job SYSTEM AND METHOD OF USING COMPUTER SOFTWARE AND HARDWARE COMPONENTS FOR LEARNING SAFETY PROCEDURES IN THE WORKPLACE CHARACTERIZED BY THE USE OF VIRTUAL REALITY
DE102020215285A1 (en) 2020-12-03 2022-06-09 Robert Bosch Gesellschaft mit beschränkter Haftung Method for parallax control and binocular data glasses with a computing unit for carrying out the method
CN112807654B (en) * 2020-12-05 2021-12-17 淮北禾获人科技有限公司 Electronic judging platform and method for race walking competitions
JP2024504261A (en) * 2020-12-17 2024-01-31 ルメヌイティ、エルエルシー Method and system for image correction and processing in high magnification photography utilizing partial reflectors
EP4252048A4 (en) * 2020-12-21 2024-10-16 Digilens Inc. EYEGLOW SUPPRESSION IN WAVEGUIDE-BASED DISPLAYS
CN112904803B (en) * 2021-01-15 2022-05-03 西安电子科技大学 Multi-splicing-surface deformation and flatness fine adjustment system, method, equipment and application
JP2022113031A (en) * 2021-01-22 2022-08-03 ソフトバンク株式会社 Control device, program, system, and control method
TWI769815B (en) * 2021-02-03 2022-07-01 大立光電股份有限公司 Plastic light-folding element, imaging lens assembly module and electronic device
CN112819590B (en) * 2021-02-25 2023-03-10 紫光云技术有限公司 Method for managing product configuration information in cloud product service delivery process
JP7540364B2 (en) * 2021-02-26 2024-08-27 セイコーエプソン株式会社 Optical module and head-mounted display device
US12073641B2 (en) 2021-03-31 2024-08-27 Arm Limited Systems, devices, and/or processes for dynamic surface marking
US11995904B2 (en) 2021-03-31 2024-05-28 Arm Limited Systems, devices, and/or processes for dynamic surface marking
CN117222932A (en) 2021-03-31 2023-12-12 斯纳普公司 Eye wearable device projector brightness control
WO2022207145A1 (en) * 2021-03-31 2022-10-06 Arm Limited Systems, devices, and/or processes for dynamic surface marking
US12073640B2 (en) 2021-03-31 2024-08-27 Arm Limited Systems, devices, and/or processes for dynamic surface marking
CN113064280A (en) * 2021-04-08 2021-07-02 恒玄科技(上海)股份有限公司 Intelligent display device
US11892624B2 (en) * 2021-04-27 2024-02-06 Microsoft Technology Licensing, Llc Indicating an off-screen target
CN113240818A (en) * 2021-04-29 2021-08-10 广东元一科技实业有限公司 Method for simulating and displaying dummy model clothes
KR20220151420A (en) * 2021-05-06 2022-11-15 삼성전자주식회사 Wearable electronic device and method for outputting 3d image
CN113112183B (en) * 2021-05-06 2024-03-19 国家市场监督管理总局信息中心 Method, system and readable storage medium for risk assessment of entry and exit dangerous goods
US12003697B2 (en) 2021-05-06 2024-06-04 Samsung Electronics Co., Ltd. Wearable electronic device and method of outputting three-dimensional image
CN113115008B (en) * 2021-05-17 2023-05-19 哈尔滨商业大学 A pipe gallery master-slave operation inspection system and method
KR102337907B1 (en) * 2021-05-20 2021-12-09 주식회사 아진엑스텍 Augmented reality smart glass device
GB2608186B (en) * 2021-06-25 2026-01-07 Thermoteknix Systems Ltd Augmented Reality System
TR2021010513A1 (en) * 2021-06-28 2023-01-23 Kaitek Yazilim Elektronik Bilg San Tic Ltd Sti A system that performs mass production process analysis with an eye-tracking and accelerometer-enabled mixed reality headset
CN113569645B (en) * 2021-06-28 2024-03-22 广东技术师范大学 Track generation method, device and system based on image detection
CN113503867A (en) * 2021-07-16 2021-10-15 吕国浩 Novel measurement lofting instrument
KR102321470B1 (en) * 2021-07-28 2021-11-03 주식회사 셀리코 Electrochromic layer based vision aid device and vision aid glasses comprising thereof
CN113576560B (en) * 2021-08-19 2025-02-18 北京航天总医院 Holographic thoracoscopy system
HU231709B1 (en) * 2021-08-31 2025-10-28 Pázmány Péter Katolikus Egyetem Augmented reality based system and method
CN113867007A (en) * 2021-09-01 2021-12-31 中联通服(北京)通讯技术有限公司 Front-and-rear adjustable structure for smart glasses
CN113820862B (en) 2021-09-10 2023-06-27 维沃移动通信有限公司 Optical lens and optical glasses
US20230077780A1 (en) * 2021-09-16 2023-03-16 International Business Machines Corporation Audio command corroboration and approval
CN115967855B (en) * 2021-10-08 2024-12-27 台中科技大学 Synchronous live broadcast teaching device integrating real-time video recording and screenshot functions
CN114115453B (en) * 2021-10-21 2024-02-09 维沃移动通信有限公司 Electronic equipment
KR102362038B1 (en) * 2021-10-27 2022-02-14 김종찬 Film screening screen device for train vehicles
CN114047829B (en) * 2021-10-28 2024-11-22 西安微电子技术研究所 A keyboard and mouse device sharing method
CN114030355A (en) * 2021-11-15 2022-02-11 智己汽车科技有限公司 Vehicle control method and device, vehicle and medium
KR102631231B1 (en) * 2021-11-17 2024-01-31 주식회사 피앤씨솔루션 Ar glasses apparatus with protective cover and protective cover for ar glass apparatus
GB202116754D0 (en) * 2021-11-19 2022-01-05 Sensivision Ltd Handheld guidance device for the visually-impaired
JP2023080813A (en) * 2021-11-30 2023-06-09 キヤノン株式会社 WEARABLE DEVICE, CONTROLLER, SYSTEM, CONTROL METHOD, PROGRAM
WO2023107251A1 (en) * 2021-12-06 2023-06-15 Lumileds Llc Optical filters compensating for changes in performance of next generation leds compared to legacy devices
US11914093B2 (en) * 2021-12-07 2024-02-27 Microsoft Technology Licensing, Llc RF antenna scanning for human movement classification
CN114399993A (en) * 2021-12-20 2022-04-26 南京模拟技术研究所 Standard law enforcement training system
US12337845B1 (en) * 2021-12-23 2025-06-24 United Services Automobile Association (Usaa) Method and system for automatically detecting a vehicle accident
US12088781B2 (en) * 2021-12-30 2024-09-10 Snap Inc. Hyper-connected and synchronized AR glasses
US20230222197A1 (en) * 2022-01-07 2023-07-13 Jumio Corporation Biometric Authentication Using Head-Mounted Devices
US20250069402A1 (en) * 2022-01-13 2025-02-27 Nec Corporation Information processing apparatus, information processing method, and non-transitory storage medium
US12313845B2 (en) 2022-01-24 2025-05-27 Microsoft Technology Licensing, Llc Illuminating spatial light modulator with LED array
US12282963B1 (en) 2022-02-08 2025-04-22 United Services Automobile Association (Usaa) Automatedly generating and issuing accident data gathering recommendations following a vehicle accident
US12205456B1 (en) 2022-03-01 2025-01-21 United Services Automobile Association (Usaa) Automatic vehicle accident notifications within a distributed network of recipients
JP7758160B2 (en) * 2022-03-18 2025-10-22 日本電気株式会社 Firefighting activity support device, firefighting activity support method, and firefighting activity support program
CN118451367A (en) * 2022-03-25 2024-08-06 华为技术有限公司 Electronic communication device capable of image projection
US20230306499A1 (en) * 2022-03-28 2023-09-28 Google Llc Vision-powered auto-scroll for lists
US11556010B1 (en) * 2022-04-01 2023-01-17 Wen-Tsun Wu Mini display device
CN114995856B (en) * 2022-06-20 2024-08-16 中国航空工业集团公司沈阳飞机设计研究所 Data upgrading method of ground intelligent system and airborne intelligent system
US12092822B2 (en) 2022-06-30 2024-09-17 Vuzix Corporation Multi-antenna augmented reality display
CN115278021B (en) * 2022-07-27 2024-11-19 唐山学院 An Internet of Things image acquisition device
US12105926B2 (en) * 2022-07-28 2024-10-01 Ntt Docomo, Inc. XR manipulation feature with smart watch
CN115049643A (en) * 2022-08-11 2022-09-13 武汉精立电子技术有限公司 Near-to-eye display module interlayer foreign matter detection method, device, equipment and storage medium
US20240064420A1 (en) * 2022-08-18 2024-02-22 Apple Inc. Cameras for multiple views
WO2024070729A1 (en) * 2022-09-28 2024-04-04 パナソニックIpマネジメント株式会社 Wearable device
US12198641B2 (en) 2022-11-23 2025-01-14 Samsung Electronics Co., Ltd. Head-mounted electronic device and method for operating the same
WO2024112017A1 (en) * 2022-11-23 2024-05-30 삼성전자 주식회사 Head-worn electronic device and operation method thereof
US12229316B1 (en) 2022-12-22 2025-02-18 United Services Automobile Association (Usaa) Anonymized transfer of personally identifiable information
WO2024162501A1 (en) * 2023-02-02 2024-08-08 엘지전자 주식회사 Electronic device
DE112024000400T5 (en) * 2023-02-15 2025-10-09 Ams-Osram International Gmbh EYETRACKER MODULE, EYETRACKING METHOD AND EYETRACKER SYSTEM
JP2024116742A (en) 2023-02-16 2024-08-28 キヤノン株式会社 Information Processing System
US12401738B2 (en) * 2023-02-22 2025-08-26 Szu Cheng Ma Wi-Fi hotspot translation mobile phone
CN116389674B (en) * 2023-03-28 2024-04-26 射阳港海会议服务有限公司 Remote conference video device
CN116186418B (en) * 2023-04-27 2023-07-04 深圳市夜行人科技有限公司 Low-light imaging system recommendation method, system and medium
CN116244238B (en) * 2023-05-12 2023-07-18 中国船舶集团有限公司第七〇七研究所 RS422 protocol and RS232 protocol compatible method and circuit for fiber optic gyroscope
CN121153002A (en) * 2023-05-16 2025-12-16 大众汽车股份公司 Method and device for the comfort-optimized adjustment of the brightness and/or volume in a vehicle using VR glasses
JP2025011383A (en) * 2023-07-11 2025-01-24 株式会社リコー Imaging device, transmission method, and program
WO2025022246A1 (en) * 2023-07-25 2025-01-30 株式会社半導体エネルギー研究所 Electronic device and method for operating same
CN117195738B (en) * 2023-09-27 2024-03-12 广东翼景信息科技有限公司 Base station antenna setting and upper dip angle optimizing method for unmanned aerial vehicle corridor
WO2025082571A1 (en) * 2023-10-20 2025-04-24 Continental Automotive Technologies GmbH Device for generating a virtual image with a speckle-reducing light mixing rod
GB2640264A (en) * 2024-04-09 2025-10-15 Envisics Ltd Light control device
US20250325219A1 (en) * 2024-04-17 2025-10-23 Natus Acquisition Ii, Llc Dynamic Posturography Apparatus with Tunable Optics
KR102772223B1 (en) * 2024-08-30 2025-02-25 김상화 Smart glass display device
CN119601169B (en) * 2024-11-28 2025-06-10 扬中市人民医院 Intelligent positioning system and method convenient for adjusting standing posture of patient
CN120235012B (en) * 2025-05-29 2025-08-26 中国科学院长春光学精密机械与物理研究所 Laser damage threshold analysis method
CN120439523B (en) * 2025-07-14 2025-10-17 深圳市博硕科技股份有限公司 Injection mold for FPC circuit board implantation

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101243392A (en) * 2005-08-15 2008-08-13 皇家飞利浦电子股份有限公司 Systems, devices and methods for end-user programmed augmented reality glasses
JP2009222774A (en) * 2008-03-13 2009-10-01 Fujifilm Corp Digital content reproduction device and reproduction control method for digital content
WO2011106798A1 (en) * 2010-02-28 2011-09-01 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
JPH09139927A (en) * 1995-11-15 1997-05-27 Matsushita Electric Ind Co Ltd Multipoint image transmission device
JP3921915B2 (en) * 2000-03-22 2007-05-30 松下電器産業株式会社 Display device
WO2005122128A1 (en) * 2004-06-10 2005-12-22 Matsushita Electric Industrial Co., Ltd. Wearable type information presentation device
JP4635572B2 (en) * 2004-11-09 2011-02-23 コニカミノルタホールディングス株式会社 Video display device
JP2008176681A (en) * 2007-01-22 2008-07-31 Fujifilm Corp Eyeglass-type communication support device
JP5309448B2 (en) * 2007-01-26 2013-10-09 ソニー株式会社 Display device and display method
KR101576567B1 (en) * 2009-12-04 2015-12-10 한국전자통신연구원 gesture input apparatus and gesture recognition method and apparatus using the same
US20110213664A1 (en) * 2010-02-28 2011-09-01 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
JP6211144B1 (en) * 2016-07-04 2017-10-11 株式会社コロプラ Display control method and program for causing a computer to execute the display control method

Cited By (10)

Publication number Priority date Publication date Assignee Title
TWI687721B (en) 2010-11-08 2020-03-11 盧森堡商喜瑞爾工業公司 Display device
EP3444667B1 (en) * 2013-05-09 2023-12-06 IMAX Theatres International Limited Methods and systems of vibrating a screen
US20210200845A1 (en) * 2019-12-31 2021-07-01 Atlassian Pty Ltd. Illumination-based user authentication
US11983256B2 (en) * 2019-12-31 2024-05-14 Atlassian Pty Ltd. Illumination-based user authentication
WO2023076841A1 (en) * 2021-10-25 2023-05-04 Atieva, Inc. Contextual vehicle control with visual representation
US12502962B2 (en) 2021-10-25 2025-12-23 Atieva, Inc. Contextual vehicle control with visual representation
US20230186800A1 (en) * 2021-12-15 2023-06-15 Motorola Mobility Llc Augmented reality display device having contextual adaptive brightness
US11908356B2 (en) * 2021-12-15 2024-02-20 Motorola Mobility Llc Augmented reality display device having contextual adaptive brightness
US11800246B2 (en) 2022-02-01 2023-10-24 Landscan Llc Systems and methods for multispectral landscape mapping
US12231785B2 (en) 2022-02-01 2025-02-18 Landscan Llc Systems and methods for multispectral landscape mapping

Also Published As

Publication number Publication date
EP2761362A2 (en) 2014-08-06
EP2761362A4 (en) 2014-08-06
CN103946732A (en) 2014-07-23
WO2013049248A3 (en) 2013-07-04
WO2013049248A2 (en) 2013-04-04
KR20140066258A (en) 2014-05-30
JP2015504616A (en) 2015-02-12

Similar Documents

Publication Publication Date Title
CN103946732B (en) Video display modification based on sensor input to see-through, near-eye displays
US11275482B2 (en) Ar glasses with predictive control of external device based on event input
US8964298B2 (en) Video display modification based on sensor input for a see-through near-to-eye display
US20200192089A1 (en) Head-worn adaptive display
US9341843B2 (en) See-through near-eye display glasses with a small scale image source
US9366862B2 (en) System and method for delivering content to a group of see-through near eye display eyepieces
US9223134B2 (en) Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses
US8467133B2 (en) See-through display with an optical assembly including a wedge-shaped illumination system
US8482859B2 (en) See-through near-eye display glasses wherein image light is transmitted to and reflected from an optically flat film
US9097891B2 (en) See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment
US9129295B2 (en) See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear
US9182596B2 (en) See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light
US8472120B2 (en) See-through near-eye display glasses with a small scale image source
US8477425B2 (en) See-through near-eye display glasses including a partially reflective, partially transmitting optical element
US9229227B2 (en) See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US9134534B2 (en) See-through near-eye display glasses including a modular image source
US9097890B2 (en) Grating in a light transmissive illumination system for see-through near-eye display glasses
US8488246B2 (en) See-through near-eye display glasses including a curved polarizing film in the image source, a partially reflective, partially transmitting optical element and an optically flat film
US20130278631A1 (en) 3d positioning of augmented reality information
US20190025587A1 (en) Ar glasses with event and user action control of external applications
US20170344114A1 (en) Ar glasses with predictive control of external device based on event input
US20160187654A1 (en) See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US20120212499A1 (en) System and method for display content control during glasses movement
US20120212484A1 (en) System and method for display content placement using distance and location information
US20120242698A1 (en) See-through near-eye display glasses with a multi-segment processor-controlled optical layer

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
ASS Succession or assignment of patent right

Owner name: MICROSOFT TECHNOLOGY LICENSING LLC

Free format text: FORMER OWNER: MICROSOFT CORP.

Effective date: 20150727

C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20150727

Address after: Washington State

Applicant after: MICROSOFT TECHNOLOGY LICENSING, LLC

Address before: Washington State

Applicant before: Microsoft Corp.

GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20190614