CN121299871A - Multi-part lens frame - Google Patents

Multi-part lens frame

Info

Publication number
CN121299871A
CN121299871A (application CN202510914848.3A)
Authority
CN
China
Prior art keywords
lens
frame portion
head
frame
supplementary
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202510914848.3A
Other languages
Chinese (zh)
Inventor
T·T·凯尔
贺吟涓
K·A·巴比亚兹
A·N·齐默尔曼
J·L·昂
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Inc filed Critical Apple Inc
Publication of CN121299871A

Landscapes

  • Lens Barrels (AREA)

Abstract

This invention relates to a "multi-component lens frame." A head-mounted device may have optical modules, each having a display that generates an image and a corresponding lens that provides the image to an eyebox for viewing by a user. The optical modules may each include a lens barrel to which the display and lens are mounted. A removable supplemental lens, such as a prescription lens, may be coupled to one or both of the optical modules. The supplemental lens may be mounted in a lens assembly comprising multiple frame portions. The lens may include a flange interposed between the frame portions. The frame portions may be adhesively attached and/or snapped together. One or more of the frame portions and the lens may have mating features for aligning the frame portions with the lens. By forming the lens assembly with multiple frame portions, stress on the lens and frame portions can be reduced.

Description

Multi-component lens frame
The present application claims priority from U.S. patent application Ser. No. 19/215,197, filed May 21, 2025, and U.S. provisional patent application Ser. No. 63/669,064, filed July 9, 2024, which are hereby incorporated by reference in their entireties.
Technical Field
The present invention relates generally to electronic devices, including wearable electronic devices, such as head-mounted devices.
Background
An electronic device, such as a head-mounted device, may have a display for displaying images. The display may be housed in an optical module. When the head-mounted device is worn on the user's head, the user may view the displayed image.
Disclosure of Invention
The head-mounted device may have a left eye optical module and a right eye optical module. Each optical module may have a display that produces an image and a corresponding lens that provides the image to an associated eyebox for viewing by a user. The optical modules may each include a lens barrel in which the display and lens of the optical module are mounted.
A removable supplemental lens, such as a prescription lens, may be coupled to one or both of the optical modules. The supplemental lens may be mounted in a lens assembly comprising a plurality of frame portions. The lens may include a flange interposed between the frame portions. One or more of the frame portions and the lens may have mating features for aligning the frame portions with the lens. Alternatively or additionally, one or more of the frame portions may comprise one or more polymer portions. The flange may contact the one or more polymer portions.
The frame portions may be adhesively attached and/or snapped together. By forming the lens assembly with a plurality of frame portions, stresses on the lens and frame portions may be reduced.
One or more electronic components may be formed in or on one or more of the frame portions. For example, a wireless communication circuit, such as a Near Field Communication (NFC) circuit, may be formed on one of the frame portions. The NFC circuit may include an NFC chip mounted on the frame portion and an NFC coil wound around the frame portion.
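The NFC circuitry described above could let a host device identify an attached supplemental lens. The Python sketch below is a hypothetical illustration: the record layout (a 4-byte lens ID plus left/right sphere power in quarter-diopter steps) is an assumption for this example, since the text does not specify what data the NFC chip stores.

```python
import struct

# Hypothetical 8-byte NFC record for a supplemental lens frame:
# a 4-byte unsigned lens ID followed by two signed 16-bit sphere
# powers encoded in quarter-diopter steps. Layout is illustrative only.
LENS_RECORD = ">Ihh"

def parse_lens_payload(payload: bytes) -> dict:
    """Decode the hypothetical lens record read from the frame's NFC chip."""
    lens_id, left_q, right_q = struct.unpack(LENS_RECORD, payload)
    return {
        "lens_id": lens_id,
        "left_sphere_diopters": left_q / 4.0,   # quarter steps -> diopters
        "right_sphere_diopters": right_q / 4.0,
    }

# Example: lens 7 with -1.25 D left and -2.00 D right correction.
record = struct.pack(LENS_RECORD, 7, -5, -8)
info = parse_lens_payload(record)
```

A head-mounted device could use such a record to confirm that a clipped-on prescription lens matches the active user profile before displaying content.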
Drawings
Fig. 1 is a top view of an exemplary head mounted device according to some embodiments.
Fig. 2 is a rear view of an exemplary head mounted device according to some embodiments.
Fig. 3 is a schematic diagram of an exemplary head mounted device according to some embodiments.
Fig. 4 is a cross-sectional side view of an exemplary optical module having a removable supplemental lens in a supplemental lens assembly, according to some embodiments.
Fig. 5 is an exploded view of an exemplary supplemental lens assembly including two frame portions coupled with an adhesive, according to some embodiments.
Fig. 6 is a side view of an exemplary supplemental lens assembly including two frame portions coupled with an adhesive, according to some embodiments.
Fig. 7 is an exploded view of an exemplary supplemental lens assembly including two frame portions snapped together, according to some embodiments.
Fig. 8 is a side view of an exemplary supplemental lens assembly including two frame portions snapped together, according to some embodiments.
Fig. 9 is an exploded view of an exemplary supplemental lens assembly including three frame portions snapped together, according to some embodiments.
Fig. 10 is a side view of an exemplary supplemental lens assembly including three frame portions snapped together, according to some embodiments.
Fig. 11 is a side view of an exemplary supplemental lens assembly including a plurality of frame portions and wireless communication circuitry formed on one of the frame portions, according to some embodiments.
Fig. 12 is a perspective view of an exemplary frame portion with wireless communication circuitry according to some embodiments.
Detailed Description
An electronic device, such as a head-mounted device, may have a front face facing away from a user's head, and may have an opposite back face facing the user's head. The optical module on the back side may be used to provide an image to the user's eyes.
In some embodiments, the head-mounted device may include one or more lenses that provide an image to the user's eyes. It may also be desirable to include one or more supplemental lenses that may be added to the device. For example, a supplemental lens may be a prescription lens that clips onto, magnetically attaches to, or otherwise couples to the head-mounted device. These prescription lenses may be customized to match the user's prescription.
The supplemental lenses may be mounted in a lens frame to form a lens assembly. To prevent damage to the supplemental lens when the supplemental lens is installed in the lens frame, the lens frame may be a multi-component lens frame assembled around the supplemental lens. For example, a supplemental lens may be inserted into the first portion of the lens frame, and one or more additional portions of the lens frame may be glued, snapped, or otherwise attached to the first portion of the lens frame to mount the lens within the multi-component lens frame. In this way, the supplemental lens may be mounted in the lens frame without subjecting the supplemental lens to unnecessary stress/strain.
A top view of an exemplary head mounted device is shown in fig. 1. As shown in fig. 1, a head-mounted device such as electronic device 10 may have a head-mounted support structure such as a housing 12. The housing 12 may include a portion (e.g., a support structure 12T) for allowing the device 10 to be worn on the head of a user. The support structure 12T may be formed of fabric, polymer, metal, and/or other materials. The support structure 12T may form a strap or other head-mounted support structure to help support the device 10 on the user's head. The main support structure of the housing 12 (e.g., the main housing portion 12M) may support electronic components such as the display 14. The main housing portion 12M may include a housing structure formed of metal, polymer, glass, ceramic, and/or other materials. For example, the housing portion 12M may have housing walls on the front face F and housing walls on adjacent top, bottom, left and right sides, formed of a rigid polymer or other rigid support structure, and optionally covered with fabric, leather, or other soft material, or the like. The walls of the housing portion 12M may enclose the interior components 38 in the interior region 34 of the device 10 and may separate the interior region 34 from the environment surrounding the device 10 (the exterior region 36). Internal components 38 may include integrated circuits, actuators, batteries, sensors, and/or other circuitry and structures for device 10. The housing 12 may be configured to be worn on the head of a user and may form glasses, hats, helmets, goggles, and/or another head-mounted device. Configurations in which the housing 12 forms goggles or glasses are sometimes described herein as examples.
The front face F of the housing 12 may face outwardly away from the user's head and face. The opposite back R of the housing 12 may face the user. The portion of the housing 12 (e.g., the portion of the main housing 12M) that is located on the back face R may form a cover, such as the cover 12C (sometimes referred to as a blind). The presence of the cover 12C on the back surface R may help to conceal the internal housing structure, the internal components 38, and other structures in the interior region 34 from view by the user.
The device 10 may have a left optical module and a right optical module 40. Each optical module may include a respective display 14, a lens 30, and a support structure 32. The support structure 32, which may sometimes be referred to as a lens barrel or optical module support structure, may comprise a hollow cylindrical structure having an open end or other support structure for housing the display 14 and the lens 30. The support structure 32 may, for example, include a left lens barrel that supports the left display 14 and the left lens 30 and a right lens barrel that supports the right display 14 and the right lens 30.
Display 14 may include an array of pixels or other display device to produce an image. The display 14 may include, for example, organic light emitting diode pixels formed on a substrate with thin film circuitry and/or formed on a semiconductor substrate, pixels formed from crystalline semiconductor die, liquid crystal display pixels, scanning display devices, and/or other display devices for producing images.
The lens 30 may include one or more lens elements for providing image light from the display 14 to the respective eyebox 13. The lenses may be implemented using refractive glass lens elements, using mirror lens structures (catadioptric lenses), using fresnel lenses, using holographic lenses, and/or other lens systems.
When the user's eyes are located in the eyebox 13, the displays (display panels) 14 operate together to form a display of the device 10 (e.g., the user's eyes may view images provided by the respective left and right optical modules 40 in the eyebox 13 so that stereoscopic images are created for the user). When the user views the display, the left image from the left optical module merges with the right image from the right optical module. The images provided to the eyebox 13 may provide a virtual reality environment, an augmented reality environment, and/or a mixed reality environment for the user (e.g., different environments may be used to display different content to the user at different times). For example, in some embodiments, display 14 may be a see-through display (e.g., a display including a projector and an optical assembly that combines light from the projector and light from outside device 10).
It may be desirable to monitor the user's eyes while they are in the eyebox 13. For example, it may be desirable to use a camera to capture an image of the iris of the user (or other portion of the user's eye) for user authentication. It may also be desirable to monitor the direction of the user's gaze. The gaze tracking information may be used as a form of user input and/or may be used to determine where within the image content resolution should be locally enhanced in the foveal imaging system. To ensure that the device 10 can capture satisfactory eye images when the user's eyes are in the eyebox 13, each optical module 40 may be provided with a camera such as a camera 42 and one or more light sources such as light emitting diodes 44 (e.g., lasers, lights, etc.). If desired, a plurality of cameras 42 may be provided in each optical module 40.
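As a rough illustration of how gaze-tracking output could drive the locally enhanced resolution mentioned above, the Python sketch below clamps a fixed-radius foveal box around the tracked gaze point. The panel dimensions and foveal radius are assumed values for illustration, not figures from the text.

```python
def foveal_region(gaze_x: int, gaze_y: int,
                  width: int = 2000, height: int = 2000,
                  radius: int = 200) -> tuple:
    """Return an (x0, y0, x1, y1) box, clamped to the panel bounds, that a
    foveated renderer would draw at full resolution around the gaze point."""
    x0 = max(0, gaze_x - radius)
    y0 = max(0, gaze_y - radius)
    x1 = min(width, gaze_x + radius)
    y1 = min(height, gaze_y + radius)
    return (x0, y0, x1, y1)

# Gaze near the left edge of the panel: the box is clipped at x = 0.
box = foveal_region(100, 1000)
```

Regions outside the returned box could then be rendered at reduced resolution, saving display bandwidth and power.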
The camera 42 and the light emitting diode 44 may operate at any suitable wavelength (visible, infrared, and/or ultraviolet). In an illustrative configuration, sometimes described herein as an example, diode 44 emits infrared or near infrared light that is invisible (or nearly invisible) to the user, such as near infrared light at 950 nm or 840 nm. This allows the eye monitoring operation to be performed continuously without interfering with the user's ability to view images on the display 14.
Not all users have the same inter-pupillary distance IPD. To provide the ability for device 10 to adjust the inter-pupillary distance between modules 40, and thus the spacing IPD between eyebox 13, along lateral dimension X to accommodate different user inter-pupillary distances, device 10 may be provided with an actuator 43. The actuators 43 may be manually controlled and/or computer controlled actuators (e.g., computer controlled motors) for moving the support structures 32 relative to one another. Information about the position of the user's eyes may be collected using, for example, camera 42. The position of the eyebox 13 may then be adjusted accordingly.
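The adjustment described above, in which camera-measured eye positions drive the actuators 43 to set the module spacing, can be sketched as a simple bounded control loop. All names, the step size, and the tolerance below are assumptions for illustration, not details from the text.

```python
def measured_ipd_mm(left_pupil_x_mm: float, right_pupil_x_mm: float) -> float:
    """IPD is the lateral distance between the two pupil centers,
    e.g. as estimated from the per-module camera images."""
    return abs(right_pupil_x_mm - left_pupil_x_mm)

def step_module_spacing(current_mm: float, target_mm: float,
                        step_mm: float = 0.5, tol_mm: float = 0.25) -> float:
    """Move the barrel spacing one bounded actuator step toward the target."""
    error = target_mm - current_mm
    if abs(error) <= tol_mm:
        return current_mm          # within tolerance: hold position
    return current_mm + max(-step_mm, min(step_mm, error))

# Converge a 60 mm spacing toward a user whose pupils sit at +/-31.5 mm.
spacing = 60.0
target = measured_ipd_mm(-31.5, 31.5)
for _ in range(10):
    spacing = step_module_spacing(spacing, target)
```

Bounding each step keeps the motion smooth whether the adjustment is computer controlled or driven incrementally by manual input.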
As shown in fig. 2, cover 12C may cover back R while leaving lens 30 of optical module 40 uncovered (e.g., cover 12C may have an opening aligned with and receiving module 40). As modules 40 move relative to one another along dimension X to accommodate different interpupillary distances of different users, modules 40 move relative to a fixed housing structure, such as the wall of main portion 12M, and move relative to one another.
A schematic diagram of an exemplary electronic device, such as a head mounted device or other wearable device, is shown in fig. 3. The device 10 of fig. 3 may operate as a standalone device and/or the resources of the device 10 may be used to communicate with external electronic equipment. For example, communication circuitry in device 10 may be used to transmit user input information, sensor information, and/or other information to an external electronic device (e.g., wirelessly or via a wired connection). Each of these external devices may include components of the type shown in device 10 of fig. 3.
As shown in fig. 3, a head mounted device such as device 10 may include control circuitry 20 (also referred to herein as controller 20). Control circuitry 20 may include storage and processing circuitry for supporting the operation of device 10. The storage and processing circuitry may include storage devices such as non-volatile memory (e.g., flash memory or other electrically programmable read-only memory configured to form a solid state drive), volatile memory (e.g., static or dynamic random access memory), and the like. Processing circuitry in the control circuit 20 may be used to collect inputs from sensors and other input devices and to control output devices. The processing circuitry may be based on one or more microprocessors, microcontrollers, digital signal processors, baseband processors and other wireless communication circuits, power management units, audio chips, application specific integrated circuits, and the like. During operation, control circuitry 20 may provide visual and other outputs to a user using display 14 and other output devices.
To support communication between the device 10 and external equipment, the control circuit 20 may communicate using the communication circuit 22. The circuitry 22 may include an antenna, radio frequency transceiver circuitry, and other wireless and/or wired communication circuitry. Circuitry 22 (which may sometimes be referred to as control circuitry and/or control and communication circuitry) may support bi-directional wireless communication between device 10 and external equipment (e.g., a companion device such as a computer, cellular telephone or other electronic device, an accessory such as a pointing device, a computer stylus or other input device, speakers or other output device, etc.) via a wireless link. For example, the circuitry 22 may include radio frequency transceiver circuitry such as wireless local area network transceiver circuitry configured to support communication via a wireless local area network link, Near Field Communication (NFC) transceiver circuitry configured to support communication via a near field communication link, cellular telephone transceiver circuitry configured to support communication via a cellular telephone link, or transceiver circuitry configured to support communication via any other suitable wired or wireless communication link. For example, wireless communications may be supported via a Bluetooth® link, a WiFi® link, a wireless link operating at frequencies between 10 GHz and 400 GHz, a 60 GHz link or other millimeter wave link, a cellular telephone link, or other wireless communication link. Device 10 may include power circuitry for transmitting and/or receiving wired and/or wireless power, if desired, and may include a battery or other energy storage device. For example, the device 10 may include a coil and a rectifier to receive wireless power provided to circuitry in the device 10.
Device 10 may include an input-output device, such as device 24. The input-output device 24 may be used to gather user input, to gather information about the user's surroundings, and/or to provide output to the user. Device 24 may include one or more displays, such as display 14. The display 14 may include one or more display devices such as an organic light emitting diode display panel (a panel with organic light emitting diode pixels formed on a polymer substrate or silicon substrate containing pixel control circuitry), a liquid crystal display panel, a microelectromechanical system display (e.g., a two-dimensional mirror array or scanning mirror display device), a display panel with an array of pixels formed of crystalline semiconductor light emitting diode dies (sometimes referred to as micro-LEDs), and/or other display devices.
The sensors 16 in the input-output device 24 may include force sensors (e.g., strain gauges, capacitive force sensors, resistive force sensors, etc.), audio sensors (such as microphones), touch and/or proximity sensors (such as capacitive sensors, such as touch sensors forming buttons, touch pads, or other input devices), and other sensors. If desired, the sensors 16 may include optical sensors (such as optical sensors that emit and detect light), ultrasonic sensors, optical touch sensors, optical proximity sensors and/or other touch sensors and/or proximity sensors, monochromatic and color ambient light sensors, image sensors, fingerprint sensors, iris scan sensors, retinal scan sensors and other biometric sensors, temperature sensors, sensors for measuring three-dimensional contactless gestures ("air gestures"), pressure sensors, sensors for detecting position, orientation and/or motion (e.g., accelerometers, magnetic sensors such as compass sensors, gyroscopes, and/or inertial measurement units that include some or all of these sensors), health sensors such as blood oxygen sensors, heart rate sensors, blood flow sensors and/or other health sensors, radio frequency sensors, depth sensors (e.g., structured light sensors and/or depth sensors based on stereoscopic imaging devices that capture three-dimensional images), optical sensors such as self-mixing sensors and light detection and ranging sensors that gather time-of-flight measurements, humidity sensors, moisture sensors, gaze tracking sensors, muscle activity sensors, and/or other sensors. In some arrangements, the device 10 may use the sensor 16 and/or other input-output devices to gather user input.
For example, buttons may be used to gather button press inputs, touch sensors overlapping the display may be used to gather user touch screen inputs, a touch pad may be used to gather touch inputs, a microphone may be used to gather audio inputs, an accelerometer may be used to monitor when a finger contacts the input surface and thus may be used to gather finger press inputs, and so on.
If desired, the electronic device 10 may include additional components (see, e.g., other devices 18 in the input-output device 24). Additional components may include a haptic output device, an actuator for moving the movable housing structure, an audio output device such as a speaker, a light emitting diode for a status indicator, a light source such as a light emitting diode that illuminates portions of the housing and/or display structure, other optical output devices, and/or other circuitry for gathering input and/or providing output. The device 10 may also include a battery or other energy storage device, a connector port for supporting wired communication with auxiliary equipment, and for receiving wired power, as well as other circuitry.
A cross-sectional side view of an exemplary optical module for use in apparatus 10 is shown in fig. 4. As shown in fig. 4, the optical module 40 may have a lens barrel 32. Lens 30 may be used to provide an image along optical axis 60 from pixel P of display 14 to eyebox 13. In particular, lens 30 may focus and/or otherwise direct light emitted by pixels P to eyebox 13 to present an image to eyebox 13.
In addition to the lens 30, a supplemental lens 66 may also be coupled to the lens barrel 32. The supplemental lens 66 may be mounted in a lens frame 72 to form a supplemental lens assembly 70. The supplemental lens assembly 70 may be clipped onto the lens barrel 32, magnetically attached to the lens barrel 32, or otherwise removably coupled to the lens barrel 32. In the illustrative example of fig. 4, the supplemental lens assembly 70 may be removably coupled to the lens barrel 32 on an outer edge of the lens barrel 32. However, this is merely illustrative. In general, the supplemental lens assembly 70 may be coupled to any suitable portion of the lens barrel 32, such as within the lens barrel 32 (e.g., between side walls of the lens barrel 32), or the supplemental lens assembly 70 may be spaced apart from the lens barrel 32 while remaining interposed between the eyebox 13 and the display 14.
In some embodiments, the supplemental lens 66 may be a glass lens, a polycarbonate lens, or a lens of any other suitable material. The supplemental lens 66 may have a prescription that matches the prescription of the user of the device 10. In this manner, a user of device 10 may view a clear image produced by display 14. However, this is merely illustrative. In some embodiments, the supplemental lens 66 may be a removable lens with zero power correction (e.g., no prescription) to protect components within the lens barrel 32.
In some embodiments, it may be desirable to mount the supplemental lens 66 in the lens frame 72 while minimizing stress and/or strain imposed on the supplemental lens 66. Thus, a multi-component lens frame may be used to form the supplemental lens assembly 70. An illustrative example is shown in fig. 5.
As shown in fig. 5, the supplemental lens assembly 70 (also referred to herein as lens assembly 70) may include two frame portions 72A and 72B. As an example, the frame portion 72A may be a main frame portion, and the frame portion 72B may be an auxiliary frame portion. Frame portions 72A and 72B may be formed of metal, polymer, ceramic, and/or other suitable materials.
The lens 66 may be mounted between the frame portions 72A and 72B. Specifically, the lens 66 may first be aligned with/mounted to the frame portion 72A. For example, the lens 66 may have a flange 80 extending from a central portion of the lens 66 and coupled to the frame portion 72A.
Adhesive 74 may then be dispensed on frame portion 72A. The adhesive 74 may be, for example, an acrylic adhesive, a hot melt adhesive, and/or other suitable adhesive. Frame portion 72B may then be coupled to frame portion 72A over lens 66 and adhesive 74. Specifically, adhesive 74 may attach frame portion 72B to frame portion 72A, which in turn may fixedly mount lens 66 between frame portions 72A and 72B. In some embodiments, adhesive 74 may also adhere lens 66 directly to frame portion 72A and/or frame portion 72B.
The frame portion 72B may have mating features 76 and the lens 66 may have mating features 78 on a flange 80. The mating features 76 and 78 may be opposing features, such as opposing protrusions and notches/recesses. For example, in the illustrative example of fig. 5, the mating features 76 may be protrusions and the mating features 78 may be recesses. Thus, when the frame portion 72B is attached to the frame portion 72A using the adhesive 74, the frame portion 72B may be aligned with the lens 66.
A side view of an assembled exemplary lens assembly 70 is shown in fig. 6. As shown in fig. 6, flange 80 of lens 66 may be coupled/mounted to projection 69 of frame portion 72A. Adhesive 74 may then fill the channels 71 of the frame portion 72A. After dispensing adhesive 74, frame portion 72B may be coupled to frame portion 72A, and adhesive 74 may attach frame portion 72B to frame portion 72A. Additionally, in the illustrative example of fig. 6, adhesive 74 may attach peripheral edge 67 of flange 80 of lens 66 to frame portion 72A and frame portion 72B. However, in some embodiments, adhesive 74 may contact only frame portions 72A and 72B and not lens 66.
Because the lens 66 is held between the frame portions 72A and 72B rather than press-fit into a single frame, little or no stress/strain may be applied to the lens 66. For example, lens 66 may have a strain of less than 0.5%, less than 0.1%, or 0% when mounted between frame portions 72A and 72B. In this manner, the lens 66 may be mounted between the frame portions 72A and 72B to form the lens assembly 70 with little or no stress/strain imposed on the lens 66.
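The strain budget quoted above lends itself to a simple acceptance check after assembly. In the Python sketch below, only the percentage limits come from the text; the idea of gating on a measured strain value is an assumed manufacturing step, and the function name is hypothetical.

```python
def lens_within_strain_budget(measured_strain: float,
                              limit: float = 0.005) -> bool:
    """True if the measured lens strain (expressed as a fraction, so
    0.001 == 0.1%) stays within the allowed budget (0.5% by default,
    per the adhesively joined frame of figs. 5-6)."""
    return measured_strain <= limit

# A lens at 0.1% strain passes; one at 0.7% exceeds the 0.5% budget.
ok = lens_within_strain_budget(0.001)
too_high = lens_within_strain_budget(0.007)
```

A looser limit (e.g., 2% for the snap-fit frame of figs. 7-8) could be passed via the `limit` parameter.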
In addition to applying less stress to the lens 66, less stress may be applied to the frame portions 72A and 72B when the lens 66 is installed. Accordingly, one or more electronic components, such as electronic component 65, may be embedded in frame portion 72A. The electronic component 65 may be a chip (e.g., a Radio Frequency Identification (RFID) chip), a coil for wireless communication (e.g., Near Field Communication (NFC)), or another suitable electronic component. In other words, wireless communication circuitry (which may include NFC circuitry or other suitable wireless communication circuitry) may be embedded in the frame portion 72A.
Although fig. 6 shows the frame portion 72A including the electronic component 65, this is merely illustrative. In some embodiments, one or more electronic components may be incorporated into frame portion 72A and/or frame portion 72B.
Further, although fig. 6 shows adhesive 74 coupling frame portion 72A to frame portion 72B, this is merely illustrative. In general, the frame portions 72A and 72B may be attached to each other using any suitable attachment means. As illustrative examples, frame portion 72A may be attached and/or adhered to frame portion 72B using laser welding, ultrasonic welding, or heat staking.
In some implementations, the multi-component lens frame may be formed without using an adhesive to couple the frame portions. An illustrative example is shown in fig. 7.
As shown in fig. 7, a lens assembly 82 that may be used in place of the lens assembly 70 of fig. 4 may include two frame portions 84A and 84B. As an example, the frame portion 84A may be a main frame portion and the frame portion 84B may be an auxiliary frame portion. The frame portions 84A and 84B may be formed of metal, polymer, ceramic, and/or other suitable materials.
A lens 86 (which may correspond to lens 66 of fig. 4) may be mounted between frame portions 84A and 84B. Specifically, the lens 86 may be first aligned with the frame portion 84A. The lens 86 may have a flange extending from a central portion of the lens 86 and coupled to/aligned with the frame portion 84A. Additionally or alternatively, the frame portion 84A may include a polymer ring 88 on the inner surface 89, which may be silicone, an elastomer, and/or another compliant material. When the lens 86 is coupled to/aligned with the frame portion 84A, the lens 86 may be in contact with the polymer ring 88. However, the inclusion of the polymer ring 88 is merely illustrative. In some embodiments, one or more polymeric portions (or portions of another compliant material) may be bonded to the inner surface 89.
After lens 86 is coupled to frame portion 84A/aligned with frame portion 84A, frame portion 84B may be coupled to frame portion 84A over lens 86. Specifically, frame portion 84B may be snapped onto frame portion 84A, which in turn may mount lens 86 between frame portions 84A and 84B.
Although not shown in fig. 7, frame portion 84A, frame portion 84B, and/or lens 86 may have mating features (e.g., mating features 76 and 78 shown in fig. 5). Thus, when frame portion 84B is attached to frame portion 84A, frame portion 84B and/or frame portion 84A may be aligned with lens 86.
A side view of an assembled exemplary lens assembly 82 is shown in fig. 8. As shown in fig. 8, the flange 90 of the lens 86 may be in contact with the polymer ring 88 of the frame portion 84A. After the lens 86 is aligned with the polymer ring 88 and the frame portion 84A, the frame portion 84B may snap into a recess 92 in the frame portion 84A. In this manner, after the frame portions 84A and 84B are snapped together, the lens 86 may be installed between the polymer ring 88 of the frame portion 84A and the frame portion 84B.
Because the lens 86 is held between the frame portions 84A and 84B with the compliant polymer ring 88, little or no stress/strain may be applied to the lens 86. For example, the lens 86 may have a strain of less than 2%, less than 1%, less than 0.9%, or less than 0.6%. In this manner, the lens 86 may be mounted between the frame portions 84A and 84B to form the lens assembly 82 with little or no stress/strain imposed on the lens 86.
In addition to applying less stress to the lens 86, less stress may be applied to the frame portions 84A and 84B when the lens 86 is installed. Accordingly, one or more electronic components, such as electronic component 65, may be embedded in frame portion 84A. The electronic component 65 may be a chip (e.g., a Radio Frequency Identification (RFID) chip), a coil for wireless communication (e.g., Near Field Communication (NFC)), or another suitable electronic component. In other words, wireless communication circuitry (which may include NFC circuitry or other suitable wireless communication circuitry) may be embedded in the frame portion 84A.
Although fig. 8 shows the frame portion 84A including the electronic component 65, this is merely illustrative. In some embodiments, one or more electronic components may be incorporated into frame portion 84A and/or frame portion 84B.
Although fig. 7 and 8 show a multi-component lens frame snapped together without adhesive, this is merely illustrative. If desired, an adhesive may be applied between frame portions 84A and 84B and/or lens 86. Alternatively or additionally, frame portion 84A may be attached to frame portion 84B using laser welding, ultrasonic welding, or heat staking.
In the illustrative examples of fig. 5-8, a multi-component lens frame is formed using two lens frame portions. However, this is merely illustrative. In general, any suitable number of lens frame portions may be used to form a multi-component lens frame. An illustrative example of a multi-component lens frame comprising three lens frame portions is shown in fig. 9.
As shown in fig. 9, a lens assembly 94 that may be used in place of the lens assembly 70 of fig. 4 may include three frame portions 98A, 98B, and 98C. As an example, the frame portion 98A may be a main frame portion, the frame portion 98B may be an auxiliary frame portion, and the frame portion 98C may be an inner ring frame portion. The frame portions 98A, 98B, and 98C may be formed of metal, polymer, ceramic, and/or other suitable materials.
A lens 96 (which may correspond to lens 66 of fig. 4) may be mounted between frame portions 98A, 98B, and 98C. Specifically, frame portion 98C may be first aligned with frame portion 98A and/or attached to frame portion 98A (e.g., frame portion 98C may be adhesively attached to frame portion 98A, may snap-fit with frame portion 98A, or otherwise be attached to frame portion 98A). Lens 96 may then be aligned with frame portions 98A and 98C. Lens 96 may have a flange extending from a central portion of lens 96 and coupled to/aligned with frame portions 98A and 98C. Additionally or alternatively, the frame portion 98C may include one or more polymer portions 106, which may be silicone, elastomer, and/or another compliant material. When lens 96 is coupled to/aligned with frame portion 98C, portions of lens 96 may be in contact with polymer portion 106.
When lens 96 is coupled to frame portions 98A and 98C/aligned with frame portions 98A and 98C, lens 96 may contact frame portion 98C. In some exemplary embodiments, the frame portion 98C may be formed from one or more polymers, such as polybutylene terephthalate (PBT) and/or silicone.
After lens 96 is coupled to frame portions 98A and 98C, frame portion 98B may be coupled to frame portions 98A and 98C over lens 96. Specifically, frame portion 98B may be snapped to frame portion 98A and/or frame portion 98C, which in turn may mount lens 96 between frame portion 98A/98C (e.g., polymer portion 106) and frame portion 98B.
Although not shown in fig. 9, frame portion 98A, frame portion 98B, frame portion 98C, and/or lens 96 may have mating features (e.g., mating features 76 and 78 shown in fig. 5). Thus, when frame portion 98B is attached to frame portions 98A and 98C, frame portion 98B, frame portion 98C, and/or frame portion 98A may be aligned with lens 96.
A side view of an assembled exemplary lens assembly 94 is shown in fig. 10. As shown in fig. 10, the frame portion 98C may first be mounted together/aligned with the frame portion 98A. Flange 108 of lens 96 may then be mounted to frame portion 98C. After lens 96 is aligned with frame portion 98C, extension 104 of frame portion 98B may snap into recess 105 of frame portion 98A. In this manner, lens 96 may be mounted between frame portions 98A, 98B, and 98C.
By mounting lens 96 between frame portions 98A, 98B, and 98C, little or no stress/strain may be applied to lens 96, even though lens 96 is mounted between frame portions 98A, 98B, and 98C. For example, lens 96 may have a strain of 4.5% or less, less than 0.5%, less than 0.2%, or 0%. In this manner, lens 96 may be mounted between frame portions 98A, 98B, and 98C to form lens assembly 94 with little or no stress/strain applied to lens 96.
In addition to applying less stress to the lens 96, less stress may be applied to the frame portions 98A, 98B, and 98C when the lens 96 is installed. Accordingly, one or more electronic components, such as electronic component 65, may be embedded in frame portion 98A. The electronic component 65 may be a chip (e.g., a Radio Frequency Identification (RFID) chip), a coil for wireless communication (e.g., near Field Communication (NFC)), or another suitable electronic component. In other words, wireless communication circuitry (which may include NFC circuitry or other suitable wireless communication circuitry) may be embedded in the frame portion 98A.
Although fig. 10 shows the frame portion 98A including the electronic component 65, this is merely illustrative. In some embodiments, one or more electronic components may be incorporated into frame portion 98A, frame portion 98B, and/or frame portion 98C.
Although fig. 9 and 10 show a multi-component lens frame snapped together without adhesive, this is merely illustrative. If desired, an adhesive may be applied between frame portions 98A, 98B, and/or 98C and/or lens 96. Alternatively or additionally, frame portions 98A, 98B, and/or 98C may be attached to one another using laser welding, ultrasonic welding, or heat staking.
Instead of or in addition to incorporating the electronic components 65 in the frame portions 98A, 98B and/or 98C, one or more electronic components may be incorporated between the frame portions 98A, 98B and/or 98C. An illustrative example is shown in fig. 11.
As shown in fig. 11, a Near Field Communication (NFC) chip may be incorporated between the frame portions 98A and 98C. Specifically, a Printed Circuit Board (PCB) 116 may be formed on the frame portion 98C between the frame portions 98A and 98C. PCB 116 may be adhesively attached or otherwise attached to frame portion 98C. NFC chip 114 may be attached to PCB 116.
NFC chip 114 may be coupled to NFC coil 110, and NFC coil 110 may transmit and receive NFC signals. NFC coil 110 may be formed on an optional stiffener 112 (which may be a metal layer or another suitable rigid material layer), or NFC coil 110 may be formed directly on frame portion 98C.
Whether NFC coil 110 is formed on stiffener 112 or directly on frame portion 98C, NFC coil 110 may be wrapped around frame portion 98C. As shown in the illustrative example of fig. 12, NFC coil 110 may be coupled to NFC chip 114 and may be wrapped around peripheral edge 118 of frame portion 98C. For example, NFC coil 110 may be wound around frame portion 98C at least once, at least twice, at least four times, or any other suitable number of times.
By incorporating NFC chip 114 and NFC coil 110 into frame portion 98C, NFC functionality may be incorporated into lens assembly 94. For example, NFC chip 114 may send information about lens assembly 94, such as a prescription for lens 96 and/or other suitable information, to electronic device 10 via NFC coil 110.
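The disclosure does not define a payload format for the information that NFC chip 114 sends (such as the prescription for lens 96). As a hedged illustration only, the sketch below assumes a hypothetical "key=value" text encoding to show how a prescription payload could be packed on the lens-assembly side and recovered on the device side; the field names and encoding are inventions for this example, not part of the disclosure.

```python
# Illustrative only: a hypothetical payload encoding for prescription data.
# Neither the field names nor the delimiter format comes from the disclosure.

def encode_lens_payload(fields: dict[str, str]) -> bytes:
    """Pack prescription fields into a simple delimited byte string."""
    return ";".join(f"{key}={value}" for key, value in fields.items()).encode("ascii")

def decode_lens_payload(payload: bytes) -> dict[str, str]:
    """Recover the fields on the head-mounted device side."""
    items = (item.split("=", 1) for item in payload.decode("ascii").split(";") if item)
    return {key: value for key, value in items}

# Round-trip a hypothetical prescription for the supplemental lens.
prescription = {"sphere": "-2.25", "cylinder": "-0.50", "axis": "180"}
payload = encode_lens_payload(prescription)
assert decode_lens_payload(payload) == prescription
```

In practice, such data would be carried in whatever record format the device's NFC stack expects; this example only shows the round-trip idea.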
Although fig. 11 and 12 show a single NFC coil 110 on the frame portion 98C, this is merely illustrative. In general, any suitable number of NFC coils may be wound around the frame portion 98C. For example, at least two NFC coils, at least four NFC coils, or at least six NFC coils may be wound around frame portion 98C.
Furthermore, although the chip 114 and the coil 110 have been described as an NFC chip and an NFC coil, respectively, this is merely illustrative. In general, one or more chips, coils, and/or other components of the wireless communication circuit may be mounted on/coupled to the frame portion 98C.
As described above, one aspect of the present technology is to collect and use information, such as information from an input-output device. The present disclosure contemplates that in some cases, data may be collected that includes personal information data that uniquely identifies or can be used to contact or locate a particular person. Such personal information data may include demographic data, location-based data, telephone numbers, email addresses, Twitter IDs, home addresses, data or records related to the user's health or fitness level (e.g., vital sign measurements, medication information, exercise information), birth date, user name, password, biometric information, or any other identifying or personal information.
The present disclosure recognizes that the use of such personal information in the present technology may be used to benefit users. For example, the personal information data may be used to deliver targeted content of greater interest to the user. Thus, the use of such personal information data enables a user to have programmatic control over the delivered content. In addition, the present disclosure contemplates other uses for personal information data that are beneficial to the user. For example, the health and fitness data may be used to provide insight into the general health of the user, or may be used as positive feedback to individuals who use the technology to pursue health goals.
The present disclosure contemplates that entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently apply privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining the privacy and security of personal information data. Such policies should be easily accessible to users and should be updated as the collection and/or use of data changes. Personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection/sharing should occur only after receiving the informed consent of the users. Additionally, such entities should consider taking any steps needed to safeguard and secure access to such personal information data and to ensure that other entities with access to the personal information data adhere to their own privacy policies and procedures. Moreover, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices. In addition, policies and practices should be adapted to the particular types of personal information data being collected and/or accessed, and to applicable laws and standards, including jurisdiction-specific considerations. For instance, in the United States, the collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (HIPAA), whereas health data in other countries may be subject to other regulations and policies and should be handled accordingly. Hence, different privacy practices should be maintained for different personal data types in each country.
Regardless of the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, the present technology can be configured to allow users to select to "opt in" or "opt out" of participation in the collection of personal information data during registration for services or anytime thereafter. In another example, users can select not to provide certain types of user data. In yet another example, users can select to limit the length of time user-specific data is maintained. In addition to providing "opt in" and "opt out" options, the present disclosure contemplates providing notifications relating to the access or use of personal information. For instance, a user may be notified upon downloading an application ("app") that their personal information data will be accessed and then reminded again just before personal information data is accessed by the app.
Moreover, it is the intent of the present disclosure that personal information data should be managed and handled in a way to minimize risks of unintentional or unauthorized access or use. Risk can be minimized by limiting the collection of data and deleting data once it is no longer needed. In addition, and when applicable, including in certain health-related applications, data de-identification can be used to protect a user's privacy. De-identification may be facilitated, when appropriate, by removing specific identifiers (e.g., date of birth, etc.), controlling the amount or specificity of data stored (e.g., collecting location data at a city level rather than at an address level), controlling how data is stored (e.g., aggregating data across users), and/or other methods.
Thus, while the present disclosure broadly contemplates the use of information that may include personal information data to implement one or more of the various disclosed embodiments, the present disclosure also contemplates that the various embodiments may be implemented without requiring access to the personal information data. That is, various embodiments of the present technology do not become inoperable due to the lack of all or a portion of such personal information data.
Physical environment-a physical environment refers to the physical world that people can sense and/or interact with without the assistance of an electronic system. Physical environments, such as a physical park, include physical objects, such as physical trees, physical buildings, and physical people. People can directly sense and/or interact with the physical environment, such as through sight, touch, hearing, taste, and smell.
Computer-generated reality-in contrast, a computer-generated reality (CGR) environment refers to a wholly or partially simulated environment that people sense and/or interact with via an electronic system. In CGR, a subset of a person's physical motions, or representations thereof, are tracked, and, in response, one or more characteristics of one or more virtual objects simulated in the CGR environment are adjusted in a manner that comports with at least one law of physics. For example, a CGR system may detect a person's head turning and, in response, adjust the graphical content and acoustic field presented to the person in a manner similar to how such views and sounds would change in a physical environment. In some situations (e.g., for accessibility reasons), adjustments to the characteristics of virtual objects in a CGR environment may be made in response to representations of physical motions (e.g., vocal commands). A person may sense and/or interact with a CGR object using any one of their senses, including sight, hearing, touch, taste, and smell. For example, a person may sense and/or interact with audio objects that create a 3D or spatial audio environment that provides the perception of point audio sources in 3D space. In another example, audio objects may enable audio transparency, which selectively incorporates ambient sounds from the physical environment with or without computer-generated audio. In some CGR environments, a person may sense and/or interact only with audio objects. Examples of CGR include virtual reality and mixed reality.
Virtual reality-Virtual Reality (VR) environment refers to a simulated environment designed to be based entirely on computer-generated sensory input for one or more senses. The VR environment includes a plurality of virtual objects that a person can sense and/or interact with. For example, computer-generated images of trees, buildings, and avatars representing people are examples of virtual objects. A person may sense and/or interact with virtual objects in a VR environment through a simulation of the presence of the person within the computer-generated environment and/or through a simulation of a subset of the physical movements of the person within the computer-generated environment.
Mixed reality-in contrast to a VR environment, which is designed to be based entirely on computer-generated sensory inputs, a mixed reality (MR) environment refers to a simulated environment that is designed to incorporate sensory inputs from the physical environment, or representations thereof, in addition to including computer-generated sensory inputs (e.g., virtual objects). On a virtuality continuum, a mixed reality environment is anywhere between, but not including, a wholly physical environment at one end and a virtual reality environment at the other end. In some MR environments, computer-generated sensory inputs may respond to changes in sensory inputs from the physical environment. Also, some electronic systems for presenting an MR environment may track location and/or orientation with respect to the physical environment to enable virtual objects to interact with real objects (that is, physical objects from the physical environment or representations thereof). For example, a system may account for movements so that a virtual tree appears stationary with respect to the physical ground. Examples of mixed realities include augmented reality and augmented virtuality. An augmented reality (AR) environment refers to a simulated environment in which one or more virtual objects are superimposed over a physical environment, or a representation thereof. For example, an electronic system for presenting an AR environment may have a transparent or translucent display through which a person may directly view the physical environment. The system may be configured to present virtual objects on the transparent or translucent display, so that a person, using the system, perceives the virtual objects superimposed over the physical environment. Alternatively, a system may have an opaque display and one or more imaging sensors that capture images or video of the physical environment, which are representations of the physical environment.
The system composites the images or video with virtual objects, and presents the composition on the opaque display. A person, using the system, indirectly views the physical environment by way of the images or video of the physical environment, and perceives the virtual objects superimposed over the physical environment. As used herein, video of the physical environment shown on an opaque display is called "pass-through video," meaning a system uses one or more image sensors to capture images of the physical environment, and uses those images in presenting the AR environment on the opaque display. Further alternatively, a system may have a projection system that projects virtual objects into the physical environment, for example, as a hologram or on a physical surface, so that a person, using the system, perceives the virtual objects superimposed over the physical environment. An augmented reality environment also refers to a simulated environment in which a representation of a physical environment is transformed by computer-generated sensory information. For example, in providing pass-through video, a system may transform one or more sensor images to impose a select perspective (e.g., viewpoint) different from the perspective captured by the imaging sensors. As another example, a representation of a physical environment may be transformed by graphically modifying (e.g., enlarging) portions thereof, such that the modified portions may be representative but not photorealistic versions of the originally captured images. As a further example, a representation of a physical environment may be transformed by graphically eliminating or obfuscating portions thereof. Augmented virtuality-an augmented virtuality (AV) environment refers to a simulated environment in which a virtual or computer-generated environment incorporates one or more sensory inputs from the physical environment.
The sensory input may be a representation of one or more characteristics of the physical environment. For example, an AV park may have virtual trees and virtual buildings, but the face of a person is realistically reproduced from an image taken of a physical person. As another example, the virtual object may take the shape or color of a physical object imaged by one or more imaging sensors. For another example, the virtual object may employ shadows that conform to the positioning of the sun in the physical environment.
Hardware-there are many different types of electronic systems that enable a person to sense and/or interact with various CGR environments. Examples include head-mounted systems, projection-based systems, heads-up displays (HUDs), vehicle windshields having integrated display capability, windows having integrated display capability, displays formed as lenses designed to be placed on a person's eyes (e.g., similar to contact lenses), headphones/earphones, speaker arrays, input systems (e.g., wearable or handheld controllers with or without haptic feedback), smartphones, tablets, and desktop/laptop computers. A head-mounted system may have one or more speakers and an integrated opaque display. Alternatively, a head-mounted system may be configured to accept an external opaque display (e.g., a smartphone). The head-mounted system may incorporate one or more imaging sensors to capture images or video of the physical environment, and/or one or more microphones to capture audio of the physical environment. Rather than an opaque display, a head-mounted system may have a transparent or translucent display. The transparent or translucent display may have a medium through which light representative of images is directed to a person's eyes. The display may utilize digital light projection, OLEDs, LEDs, μLEDs, liquid crystal on silicon, laser scanning light sources, or any combination of these technologies. The medium may be an optical waveguide, a hologram medium, an optical combiner, an optical reflector, or any combination thereof. In one embodiment, the transparent or translucent display may be configured to become opaque selectively. Projection-based systems may employ retinal projection technology that projects graphical images onto a person's retina. Projection systems also may be configured to project virtual objects into the physical environment, for example, as a hologram or on a physical surface.
According to one embodiment, a lens assembly includes a first lens frame portion, a second lens frame portion adhesively attached to the first lens frame portion, and a lens mounted to the first and second lens frame portions, wherein the lens includes a flange interposed between the first and second lens frame portions.
According to another embodiment, the first lens frame portion optionally comprises a channel and the lens assembly comprises an adhesive in the channel, the adhesive attaching the second lens frame portion to the first lens frame portion.
According to another embodiment, the lens optionally comprises a peripheral edge and the adhesive is attached to the peripheral edges of the first lens frame portion, the second lens frame portion and the lens.
According to another embodiment, the second lens frame portion optionally includes a first mating feature and the lens includes a second mating feature on the flange that aligns the second lens frame portion with the lens.
According to another embodiment, the first mating feature optionally comprises a protrusion and the second mating feature comprises a recess.
According to another embodiment, the adhesive optionally comprises an acrylic adhesive.
According to another embodiment, the lens assembly optionally further comprises wireless communication circuitry embedded in the first lens frame portion.
According to another embodiment, the lens optionally has a strain of less than 0.1% between the first and second lens frame portions.
According to another embodiment, a head-mountable device includes a housing, an optical module supported by the housing and configured to provide an image to an eyebox, wherein the optical module includes a lens, and a supplemental lens assembly configured to be attached to the optical module. The supplemental lens assembly includes a supplemental lens including a flange and a frame including a first frame portion attached to a second frame portion, wherein the supplemental lens is mounted in the frame with the flange interposed between the first and second frame portions.
According to another embodiment, the supplemental lens optionally further comprises a peripheral edge of the flange, and the supplemental lens assembly further comprises an adhesive bonding the first frame portion, the second frame portion, and the peripheral edge.
According to another embodiment, the first frame portion optionally comprises an inner surface and a polymer ring on the inner surface, and the supplemental lens is interposed between the polymer ring and the second frame portion.
According to another embodiment, the first frame portion optionally further comprises a recess and the second frame portion snaps into the recess to attach the second frame portion to the first frame portion.
According to another embodiment, the supplemental lens optionally has a strain of less than 2% between the first and second frame portions.
According to another embodiment, the frame optionally further comprises a third frame portion interposed between the first and second frame portions, the third frame portion being coupled to the first frame portion, and the flange of the supplemental lens being interposed between the third and second frame portions.
According to another embodiment, the third frame portion optionally comprises a polymer portion, and the flange of the supplemental lens is interposed between the polymer portion and the second frame portion.
According to another embodiment, the first frame portion optionally comprises a recess, the second frame portion optionally comprises an extension, and the extension of the second frame portion snaps into the recess to attach the second frame portion to the first frame portion.
According to another embodiment, the supplemental lens assembly optionally further comprises a wireless communication chip mounted on the third frame portion and a coil wound around the third frame portion and coupled to the wireless communication chip.
According to another embodiment, the second frame portion optionally comprises a protrusion and the supplemental lens comprises a recess on the flange, the recess aligning the second frame portion with the supplemental lens.
According to one embodiment, a lens assembly includes a first lens frame portion including a recess, a second lens frame portion including an extension, wherein the extension snaps into the recess to attach the second lens frame portion to the first lens frame portion, a third lens frame portion interposed between the first lens frame portion and the second lens frame portion, wherein the third lens frame portion includes a polymer portion, and a lens mounted to the first lens frame portion, the second lens frame portion, and the third lens frame portion, wherein the lens includes a flange interposed between the polymer portion of the third lens frame portion and the second lens frame portion.
According to another embodiment, the lens assembly optionally further comprises a wireless communication chip mounted on the third lens frame portion and a coil wound around the third lens frame portion and coupled to the wireless communication chip.
The foregoing is merely illustrative and various modifications may be made to the described embodiments. The foregoing embodiments may be implemented alone or in any combination.

Claims (20)

1.一种透镜组件,所述透镜组件包括:1. A lens assembly, the lens assembly comprising: 第一透镜框架部分;First lens frame section; 第二透镜框架部分,所述第二透镜框架部分粘合地附接到所述第一透镜框架部分;以及The second lens frame portion is adhesively attached to the first lens frame portion; and 透镜,所述透镜安装到所述第一透镜框架部分和所述第二透镜框架部分,其中所述透镜包括凸缘,所述凸缘插置在所述第一透镜框架部分和所述第二透镜框架部分之间。A lens, the lens being mounted to a first lens frame portion and a second lens frame portion, wherein the lens includes a flange that is inserted between the first lens frame portion and the second lens frame portion. 2.根据权利要求1所述的透镜组件,其中所述第一透镜框架部分包括通道,并且所述透镜组件还包括:2. The lens assembly of claim 1, wherein the first lens frame portion includes a channel, and the lens assembly further includes: 位于所述通道中的粘合剂,所述粘合剂将所述第二透镜框架部分附接到所述第一透镜框架部分。An adhesive located in the channel, the adhesive attaching the second lens frame portion to the first lens frame portion. 3.根据权利要求2所述的透镜组件,其中所述透镜包括周边边缘,并且所述粘合剂附接到所述第一透镜框架部分、所述第二透镜框架部分和所述透镜的所述周边边缘。3. The lens assembly of claim 2, wherein the lens includes a peripheral edge, and the adhesive is attached to the first lens frame portion, the second lens frame portion, and the peripheral edge of the lens. 4.根据权利要求3所述的透镜组件,其中所述第二透镜框架部分包括第一配合特征部,并且所述透镜包括位于所述凸缘上的第二配合特征部,所述第二配合特征部将所述第二透镜框架部分与所述透镜对准。4. The lens assembly of claim 3, wherein the second lens frame portion includes a first mating feature, and the lens includes a second mating feature located on the flange, the second mating feature aligning the second lens frame portion with the lens. 5.根据权利要求4所述的透镜组件,其中所述第一配合特征部包括突起,并且所述第二配合特征部包括凹口。5. The lens assembly of claim 4, wherein the first mating feature includes a protrusion, and the second mating feature includes a notch. 6.根据权利要求5所述的透镜组件,其中所述粘合剂包括丙烯酸系粘合剂。6. The lens assembly of claim 5, wherein the adhesive comprises an acrylic adhesive. 7.根据权利要求1所述的透镜组件,还包括:7. The lens assembly according to claim 1, further comprising: 无线通信电路,所述无线通信电路嵌入在所述第一透镜框架部分中。A wireless communication circuit is embedded in the first lens frame portion. 
8. The lens assembly of claim 1, wherein the lens has a strain of less than 0.1% between the first lens frame portion and the second lens frame portion.

9. A head-mounted device comprising:
a housing;
an optical module supported by the housing and configured to provide an image to an eye box, wherein the optical module comprises a lens; and
a supplementary lens assembly configured to attach to the optical module, wherein the supplementary lens assembly comprises:
a supplementary lens comprising a flange; and
a frame comprising a first frame portion attached to a second frame portion,
wherein the supplementary lens is mounted in the frame with the flange interposed between the first frame portion and the second frame portion.

10. The head-mounted device of claim 9, wherein the supplementary lens further comprises a peripheral edge of the flange, and the supplementary lens assembly further comprises adhesive that bonds the first frame portion, the second frame portion, and the peripheral edge.

11. The head-mounted device of claim 9, wherein the first frame portion comprises an inner surface and a polymer ring on the inner surface, and the supplementary lens is interposed between the polymer ring and the second frame portion.

12. The head-mounted device of claim 11, wherein the first frame portion further comprises a recess, and the second frame portion snaps into the recess to attach the second frame portion to the first frame portion.

13. The head-mounted device of claim 12, wherein the supplementary lens has a strain of less than 2% between the first frame portion and the second frame portion.

14. The head-mounted device of claim 9, wherein the frame further comprises:
a third frame portion interposed between the first frame portion and the second frame portion, wherein the third frame portion is coupled to the first frame portion, and the flange of the supplementary lens is interposed between the third frame portion and the second frame portion.

15. The head-mounted device of claim 14, wherein the third frame portion comprises a polymer portion, and the flange of the supplementary lens is interposed between the polymer portion and the second frame portion.

16. The head-mounted device of claim 15, wherein the first frame portion comprises a recess, the second frame portion comprises an extension, and the extension of the second frame portion snaps into the recess to attach the second frame portion to the first frame portion.

17. The head-mounted device of claim 16, wherein the supplementary lens assembly further comprises a wireless communication chip mounted on the third frame portion and a coil wound around the third frame portion and coupled to the wireless communication chip.

18. The head-mounted device of claim 9, wherein the second frame portion comprises a protrusion, and the supplementary lens comprises a notch on the flange that aligns the second frame portion with the supplementary lens.

19. A lens assembly comprising:
a first lens frame portion comprising a recess;
a second lens frame portion comprising an extension, wherein the extension snaps into the recess to attach the second lens frame portion to the first lens frame portion;
a third lens frame portion interposed between the first lens frame portion and the second lens frame portion, wherein the third lens frame portion comprises a polymer portion; and
a lens mounted to the first lens frame portion, the second lens frame portion, and the third lens frame portion, wherein the lens comprises a flange interposed between the polymer portion of the third lens frame portion and the second lens frame portion.

20. The lens assembly of claim 19, further comprising:
a wireless communication chip mounted on the third lens frame portion; and
a coil wound around the third lens frame portion and coupled to the wireless communication chip.
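Claims 8 and 13 bound the mechanical strain on the lens where it is clamped between the frame portions (below 0.1% and below 2%, respectively). As an illustrative aside that is not part of the patent text: engineering strain is the fractional change in length, strain = (L − L0) / L0, so a dimensional measurement can be checked against such a bound directly. The Python sketch below shows this; the function names and the 30 mm flange dimension are invented for illustration only.

```python
def engineering_strain(original_length_mm: float, deformed_length_mm: float) -> float:
    """Return dimensionless engineering strain for a change in length."""
    return (deformed_length_mm - original_length_mm) / original_length_mm


def within_claim_limit(strain: float, limit: float) -> bool:
    """Check a strain value against an upper bound (e.g. 0.001 for the 0.1% of claim 8)."""
    return abs(strain) < limit


# Hypothetical example: a 30.00 mm lens flange stretched to 30.02 mm by frame clamping.
strain = engineering_strain(30.00, 30.02)
print(f"strain = {strain:.4%}")           # about 0.067%
print(within_claim_limit(strain, 0.001))  # True: within the 0.1% bound of claim 8
print(within_claim_limit(strain, 0.02))   # True: within the 2% bound of claim 13
```

Both checks pass for this example, which matches the intent stated in the abstract: splitting the frame into multiple portions keeps the strain imparted to the lens small.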
CN202510914848.3A 2024-07-09 2025-07-03 Multi-part lens frame Pending CN121299871A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202463669064P 2024-07-09 2024-07-09
US63/669,064 2024-07-09
US202519215197A 2025-05-21 2025-05-21
US19/215,197 2025-05-21

Publications (1)

Publication Number Publication Date
CN121299871A 2026-01-09

Family

ID=98284688

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202510914848.3A Pending CN121299871A (en) 2024-07-09 2025-07-03 Multi-part lens frame

Country Status (1)

Country Link
CN (1) CN121299871A (en)

Similar Documents

Publication Publication Date Title
CN113661431B (en) Optical module of head-mounted device
US20230418019A1 (en) Electronic Device With Lens Position Sensing
US11119321B2 (en) Electronic device with a display attached to a lens element
EP3910404B1 (en) Electronic devices with optical modules
US20240027778A1 (en) Head-Mounted Electronic Device
CN112540670A (en) Electronic device with finger sensor
CN112578526B (en) Lens mounting system for electronic device
KR102676300B1 (en) Lens mounting structures for head-mounted devices
CN209821509U (en) Head-mounted system
US12140767B2 (en) Head-mounted device with optical module illumination systems
US11899214B1 (en) Head-mounted device with virtually shifted component locations using a double-folded light path
US12271002B1 (en) Head-mounted display and display modules thereof
US11762422B1 (en) Electronic devices with drop protection
CN120065529A (en) Electronic device with movable part
CN117471691A (en) Electronic devices with light-blocking coverings
CN118426178A (en) System with blanket seal
CN117389040A (en) Electronic devices with lens positioners
WO2024044556A1 (en) Head-mounted electronic device with magnification tool
CN119065121A (en) Lens coating

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination