This patent application claims priority from U.S. patent application Ser. No. 18/336,703, filed on June 16, 2022, U.S. provisional patent application Ser. No. 63/355,990, filed on June 27, 2022, and U.S. provisional patent application Ser. No. 63/431,514, filed on December 9, 2022, which are incorporated herein by reference in their entireties.
Detailed Description
An electronic device may include a display and other components for presenting content to a user. The electronic device may be a wearable electronic device. A wearable electronic device, such as a head-mounted device, may have a head-mounted support structure that allows the head-mounted device to be worn on a user's head.
The head-mounted device may include a display formed of one or more display panels (displays) for displaying visual content to a user. A lens system may be used to allow a user to focus on the display and view visual content. The lens system may have a left lens aligned with the left eye of the user and a right lens aligned with the right eye of the user.
Not all users have eyes separated by the same interpupillary distance. To ensure that a wide range of users can comfortably view content on the display, the head-mounted device may be provided with a lens positioner. The lens positioner may be used to adjust the lens-to-lens spacing between the left lens and the right lens to match the user's interpupillary distance.
To prevent excessive pressure on the surface of the user's nose, a force sensor may be used to determine how much pressure the lenses apply to the user's nose as the lens-to-lens spacing varies. Control circuitry in the head-mounted device may adjust the left and right lenses to match the user's interpupillary distance unless the lenses apply excessive pressure to the user's nose (e.g., unless the pressure measured by the force sensor exceeds a threshold). In some cases, the left and right lenses may be positioned so that the lens-to-lens spacing between the left and right lenses matches the user's interpupillary distance. In other cases, the lens-to-lens spacing between the left and right lenses will be slightly greater than the user's interpupillary distance to ensure that the lenses do not press excessively against the user's nose. Sensor circuitry such as force sensing circuitry may be used to provide real-time feedback to the control circuitry regarding the pressure applied by the lenses to the user's nose, thereby ensuring that the positions of the left and right lenses are satisfactorily adjusted.
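For illustration only, the two cases described above reduce to a simple spacing rule: use the user's IPD when the nose permits, otherwise use the smallest spacing at which the measured pressure remains acceptable (slightly greater than the IPD). The sketch below is not part of this disclosure; the function name, units, and values are hypothetical.

```python
# Hypothetical sketch of the spacing rule described above. Units are
# millimeters; "min_comfortable_spacing_mm" stands in for the smallest
# spacing at which the force sensor reports acceptable nose pressure.

def choose_lens_spacing(ipd_mm, min_comfortable_spacing_mm):
    """Match the IPD when the nose allows it; otherwise use the smallest
    comfortable spacing, which is slightly greater than the IPD."""
    return max(ipd_mm, min_comfortable_spacing_mm)
```

With a small nose the comfort limit is below the IPD and the IPD is matched exactly; with a large nose the comfort limit governs instead.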
Fig. 1 shows a schematic diagram of an exemplary system with an electronic device having a sensor circuit that ensures satisfactory placement of a lens relative to facial features of a user. As shown in fig. 1, system 8 may include one or more electronic devices, such as electronic device 10. The electronic devices of system 8 may include computers, cellular telephones, head-mounted devices, wristwatch devices, and other electronic devices. The configuration in which the electronic device 10 is a head mounted device is sometimes described herein as an example.
As shown in fig. 1, an electronic device such as electronic device 10 may have control circuit 12. Control circuit 12 may include storage and processing circuitry for controlling the operation of device 10. Circuit 12 may include storage such as hard disk drive storage, non-volatile memory (e.g., electrically programmable read-only memory configured to form a solid-state drive), volatile memory (e.g., static or dynamic random-access memory), and so forth. The processing circuitry in control circuit 12 may be based on one or more microprocessors, microcontrollers, digital signal processors, baseband processors, power management units, audio chips, graphics processing units, application-specific integrated circuits, and other integrated circuits. Software code may be stored on storage in circuit 12 and run on processing circuitry in circuit 12 to implement control operations for device 10 (e.g., data gathering operations, operations involved in processing three-dimensional facial image data, operations involving the adjustment of components using control signals, etc.). Control circuit 12 may include wired and wireless communication circuitry. For example, control circuit 12 may include radio-frequency transceiver circuitry such as cellular telephone transceiver circuitry, wireless local area network transceiver circuitry, millimeter-wave transceiver circuitry, and/or other wireless communication circuitry.
During operation, the communication circuitry of the devices in system 8 (e.g., the communication circuitry of control circuitry 12 of device 10) may be used to support communication between electronic devices. For example, one electronic device may transmit video and/or audio data to another electronic device in system 8. The electronic devices in system 8 may use wired and/or wireless communication circuitry to communicate over one or more communication networks (e.g., the internet, a local area network, etc.). The communication circuitry may be used to allow the device 10 to receive data from external equipment (e.g., a network-shared computer, a portable device such as a handheld or laptop computer, an online computing device such as a remote server or other remote computing device, or other electrical device) and/or to provide data to external equipment.
The device 10 may include an input-output device 22. The input-output device 22 may be used to allow a user to provide user input to the device 10. The input-output device 22 may also be used to gather information about the environment in which the device 10 is operating. Output components in device 22 may allow device 10 to provide output to a user and may be used to communicate with external electrical equipment.
As shown in FIG. 1, the input-output device 22 may include one or more displays such as display 14. In some configurations, the display 14 of the device 10 includes left and right display panels (sometimes referred to as left and right portions of the display 14 and/or left and right displays) that are respectively aligned with the user's left and right eyes and viewable through left and right lens assemblies. In other configurations, the display 14 includes a single display panel that extends across both eyes.
The display 14 may be used to display images. Visual content displayed on display 14 may be viewable by a user of device 10. The display in device 10, such as display 14, may be an organic light emitting diode display or other display based on an array of light emitting diodes, a liquid crystal display, a liquid crystal on silicon display, a projector, or a display based on projecting a light beam onto a surface directly or indirectly through dedicated optics (e.g., a digital micromirror device), an electrophoretic display, a plasma display, an electrowetting display, a micro LED display, or any other suitable display.
The display 14 may present computer-generated content, such as virtual reality content and mixed reality content, to a user. Virtual reality content may be displayed without real world content. Mixed reality content, which may sometimes be referred to as augmented reality content, may include computer-generated images superimposed over real-world images. The real world image may be captured by a camera (e.g., a front-facing camera) and combined with the superimposed computer-generated content, or an optical coupling system may be used to allow the computer-generated content to be superimposed on the real world image. For example, a pair of mixed reality glasses or other augmented reality head mounted displays may include a display device that provides images to a user through a beam splitter, prism, holographic coupler, or other optical coupler. A configuration in which the display 14 is used to display virtual reality content to a user through a lens is described herein as an example.
The input-output device 22 may include the sensor 16. The sensor 16 may include, for example, a three-dimensional sensor (e.g., a three-dimensional image sensor such as a structured light sensor that emits beams of light and uses a two-dimensional digital image sensor to gather image data for three-dimensional images from the light spots produced when the beams illuminate a target; a binocular three-dimensional image sensor that captures three-dimensional images using two or more cameras in a binocular imaging arrangement; a three-dimensional lidar (light detection and ranging) sensor; a three-dimensional radio-frequency sensor; or another sensor that captures three-dimensional image data), a camera (e.g., an infrared and/or visible digital image sensor), a gaze tracking sensor (e.g., a gaze tracking system based on an image sensor and, if desired, a light source that emits one or more beams of light, the image sensor tracking the beams after they are reflected from the eyes of a user), touch sensors, buttons, force sensors, contact sensors such as switch-based contact sensors, gas sensors, pressure sensors, humidity sensors, magnetic sensors, audio sensors (microphones), ambient light sensors, microphones for capturing voice commands and other audio input, sensors configured to capture information about motion, position, and/or orientation (e.g., accelerometers, gyroscopes, compasses, and/or inertial measurement units that include all of these sensors or a subset of one or two of these sensors), fingerprint sensors and other biometric sensors, optical position sensors (optical encoders) and/or other position sensors such as linear position sensors, and/or other sensors. As shown in fig. 1, the sensor 16 may include sensing circuitry (sensor circuitry) configured to measure the pressure applied between objects in the system 8. The sensing circuitry may include one or more sensors, such as one or more force sensors 20.
A sensing circuit such as force sensor 20 may be used, for example, to sense the amount of pressure applied to the user's nose by a lens assembly in device 10.
User inputs and other information may be collected using sensors and other input devices in the input-output device 22. The input-output device 22 may include other devices 24 such as a haptic output device (e.g., a vibrating component), light emitting diodes and other light sources, speakers for producing audio output such as ear speakers, and other electronic components, if desired. Device 10 may include circuitry for receiving wireless power, circuitry for wirelessly transmitting power to other devices, batteries and other energy storage devices (e.g., capacitors), joysticks, buttons, and/or other components.
The electronic device 10 may have a housing structure (e.g., housing wall, strap, etc.), as shown by the exemplary support structure 26 of fig. 1. In configurations where the electronic device 10 is a head-mounted device (e.g., a pair of glasses, goggles, helmet, hat, headband, etc.), the support structure 26 may include a head-mounted support structure (e.g., a helmet shell, headband, a temple in a pair of glasses, a goggle shell structure, and/or other head-mounted structure). The head-mounted support structure may be configured to be worn on the head of a user during operation of the device 10, and may support the display 14, the sensors 16, other components 24, other input-output devices 22, and the control circuitry 12.
Fig. 2 is a top view of the electronic device 10 in an exemplary configuration in which the electronic device 10 is a head-mounted device. As shown in fig. 2, the electronic device 10 may include support structures (see, e.g., support structure 26 of fig. 1) that are used to house the components of the device 10 and to mount the device 10 on a user's head. These support structures may include, for example, structures that form housing walls and other structures for the main unit 26-2 (e.g., exterior housing walls, lens assembly structures, etc.) and straps or other supplemental support structures, such as structures 26-1, that help hold the main unit 26-2 on the user's face so that the user's eyes are located within the eyeboxes 60.
The display 14 may include left and right display panels (e.g., left and right pixel arrays, sometimes referred to as left and right displays or left and right display portions) that are mounted in left and right display modules 70 corresponding respectively to the left eye (and left eyebox 60) and right eye (and right eyebox) of a user. Positioning circuitry such as respective left and right positioners 58 may be used to individually position the modules 70, which may sometimes be referred to as lens support structures, lens assemblies, lens housings, or lens and display housings, relative to the housing wall structures of the main unit 26-2 and relative to the user's eyes. The positioners 58 may include stepper motors, piezoelectric actuators, motors, linear electromagnetic actuators, and/or other electronic components for adjusting the positions of the lens assemblies. During operation of the device 10, the positioners 58 may be controlled by the control circuit 12. For example, the positioners 58 may be used to adjust the spacing between the modules 70 (and thus the lens-to-lens spacing between the left and right lenses of the modules 70) to match the inter-pupillary distance (IPD) of the user's eyes. This allows the user to view the left and right display portions of the display 14 through the left and right lens modules. However, in some cases, if the lens-to-lens spacing is adjusted to match the user's IPD, the lenses may apply excessive pressure to the user's nose. Thus, a sensor may be incorporated into the device 10 to monitor the pressure on the nose of the user. An exemplary arrangement including a force sensor to monitor the pressure applied to the nose of a user is shown in fig. 3.
As shown in fig. 3, the lens assembly 70 (e.g., one of the two lens modules 70 shown in fig. 2) may be adjacent to the user's nose 40 when the device 10 is worn on the user's head. For simplicity, only one of the two lens modules is shown in fig. 3, but the other lens assembly 70 may have the same structure.
To ensure that the user can see the display 14 when the user's eye is in the eyebox 60 (fig. 2), the control circuit 12 may attempt to align the lens center LC with the center PC of the user's eye. At the same time, the control circuit 12 may use a sensor circuit, such as the force sensor 20, to monitor the pressure applied by the lens module 70 to the nose 40 to ensure that the lens module 70 does not over-press the nose 40 and cause discomfort.
In cases where the user's nose is small, there may be sufficient space available to align the lens center LC with the eye center PC. In cases where the user's nose is large, the control circuit 12 may position the module 70 as shown in fig. 3. For example, the distance between the lens modules 70 (the lens-to-lens spacing) may be greater than the distance desired for perfect alignment of the lens center LC with the eye center PC. The use of such a wider lens-to-lens spacing helps ensure that the lens module 70 does not exert more inward force on the nose 40 than is comfortable for the user, while still allowing the user to satisfactorily view content on the display 14 through the lens 72. The lens module 70 may be placed at a non-zero distance (gap) from the side surface of the nose 40 or may be spaced apart from the side surface of the nose 40 by a predetermined gap. The user may select whichever of these options is most comfortable, and/or default settings may be provided to the control circuit 12.
In operation, the positioner 58 may move the lens module 70 (fig. 2) toward the nose 40 in an attempt to align the lens center LC of each lens with the eye center PC. Sensors may be incorporated into the device 10 to ensure that the lens module 70 does not apply excessive pressure to the nose 40. In general, any desired sensor circuit may be used to measure the pressure on nose 40. In one example, each lens module 70 may have one or more force sensors 20.
Force sensor 20 may be incorporated between the nasal flap 29 and each lens assembly 70. The nasal flap 29 may be formed of fabric, polymer, or another material that allows the device 10 to fit comfortably on the user. For example, a nasal flap 29 may be interposed between each lens assembly 70 and the nose 40. In some embodiments, the nasal flap 29 may extend along both sides and over the top of the nose 40 (e.g., over at least a portion of the bridge of the nose 40). However, this is merely illustrative. The nasal flap 29 may have two separate portions, one between the left lens assembly 70 and the nose 40 and the other between the right lens assembly 70 and the nose 40, or may be omitted from the device 10 if desired.
In embodiments where a nasal flap 29 is included in the device 10, the force sensor 20 may be incorporated between each lens assembly 70 (i.e., the left and right lens assemblies 70) and the nasal flap 29. However, this location of force sensor 20 is merely illustrative. In general, force sensor 20 may be included in any desired location within device 10. For example, force sensor 20 may be formed between the nasal flap 29 and the nose 40 as shown in position 20', may be formed within the nasal flap 29 or integral with the nasal flap 29 as shown in position 20", or may be formed within the lens assembly 70 as shown in position 20'".
Although fig. 3 shows a single force sensor 20 between the nose 40 and the lens assembly 70, this is merely illustrative. The device 10 may have one force sensor 20 between the nose 40 and each lens assembly 70, may have multiple force sensors between the nose 40 and each lens assembly 70, may have a single force sensor between the nose 40 and one lens assembly 70, or may have any other desired arrangement of force sensors 20.
Regardless of where force sensor 20 is formed within device 10, force sensor 20 may monitor the amount of force applied to nose 40 by lens module 70 to ensure that excessive pressure is not applied to nose 40. The force sensor 20 may continuously monitor the applied force or may flag the control circuit 12 when the applied force exceeds a threshold. Force sensor 20 may be any desired type of sensor that monitors the amount of force/pressure applied to nose 40. Some examples of force sensors 20 are shown in fig. 4A-4C.
In some examples, the force sensor 20 may be a direct force sensor, such as the force sensor 20 of fig. 4A. The direct force sensor 20 of fig. 4A may be a piezoresistive sensor and may include electrodes formed by interdigitated traces 27. In particular, when pressure is applied to the direct force sensor 20, the resistance of the sensor changes with the distance between portions of the interdigitated traces 27, and circuitry such as the control circuitry 12 may measure this resistance to determine the force applied to the direct force sensor 20. If the direct force sensor 20 is placed in any of the possible positions shown in fig. 3, or is otherwise placed between the user's nose and the lens assembly 70, the direct force sensor 20 may be used to determine the force that the lens assembly 70 applies to the nose 40 as the lens assembly 70 moves toward the nose 40. The direct force sensor 20 may be used to continuously measure the pressure against the nose 40, or may be used as a threshold sensor (i.e., the control circuit 12 may determine that the resistance of the direct force sensor 20 has crossed a threshold and thus that the force on the nose 40 is too great).
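For illustration only, the two read-out modes described above for a piezoresistive sensor (continuous force estimation and threshold flagging) can be sketched as follows. The linear resistance-to-force model and every constant below are assumptions, not values from this disclosure.

```python
# Hypothetical piezoresistive readout: resistance drops as the interdigitated
# traces are pressed together, so a linear model maps resistance to force.

def force_from_resistance(r_ohms, r_unloaded_ohms=1000.0, ohms_per_newton=-50.0):
    """Continuous mode: estimate applied force from measured resistance."""
    return max(0.0, (r_ohms - r_unloaded_ohms) / ohms_per_newton)

def force_is_excessive(r_ohms, max_force_n=2.0):
    """Threshold mode: flag control circuitry when estimated force is too high."""
    return force_from_resistance(r_ohms) > max_force_n
```

In threshold mode the control circuitry only needs the boolean flag, which is all that is required to stop the positioner.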
In other examples, force sensor 20 may be formed from conductive strands, yarns, or fibers and incorporated into the fabric in device 10. For example, force sensor 20 may be formed from a smart fabric or other fabric that includes conductive strands. The conductive strands may be arranged to form a force sensor. An example of such an arrangement is shown in fig. 4B.
As shown in fig. 4B, force sensor 20 may be incorporated into fabric 31. In particular, the fabric 31 may include conductive strands 28 and non-conductive strands 33. The conductive strands 28 may be arranged to form a force sensor or a portion of a force sensor (e.g., an electrode of a sensor that indicates when the sensor is in contact with the nose of a user). For example, a control circuit such as control circuit 12 may measure a change in capacitance or resistance between the conductive strands 28. Because capacitance/resistance will vary with the distance between the conductive strands 28, the capacitance/resistance measurement indicates the amount of force on the fabric 31. The conductive strands may be formed of a metal (such as silver or copper) that may optionally be coated onto the polymer or fabric strands.
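For illustration only, the capacitance-to-force relationship described above can be modeled by treating a pair of adjacent conductive strands as a parallel-plate capacitor whose gap closes under load. The parallel-plate approximation, the linear stiffness, and all constants below are assumptions.

```python
# Hypothetical model: capacitance between strands is inversely proportional to
# their separation, so a capacitance increase implies compression of the fabric.

def strand_gap_from_capacitance(c_farads, c0_farads, gap0_m):
    """Parallel-plate model: gap = gap0 * (C0 / C)."""
    return gap0_m * (c0_farads / c_farads)

def force_from_capacitance(c_farads, c0_farads=1e-12, gap0_m=1e-3,
                           stiffness_n_per_m=500.0):
    """Convert the inferred compression into force via an assumed stiffness."""
    gap = strand_gap_from_capacitance(c_farads, c0_farads, gap0_m)
    return stiffness_n_per_m * max(0.0, gap0_m - gap)
```

A resistive readout between strands would follow the same pattern with a different distance model.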
Although the conductive strands 28 are shown as straight strands in fig. 4B, this is merely illustrative. In some embodiments, the conductive strands 28 may be serpentine or have other desired shapes, which may allow the fabric 31 to have improved flexibility.
If desired, non-conductive strands 33 may be interspersed between at least some of the conductive strands 28. As shown in fig. 4B, every other strand may be a non-conductive strand. However, this is merely illustrative. Any desired number of conductive and non-conductive strands may be incorporated into the fabric 31 to form the force sensor 20.
The fabric 31 may form some or all of the nasal flap 29 or may otherwise cover the nasal flap 29. Because the nasal flap 29 is located between the nose 40 and the lens assemblies 70 when the device 10 is worn by a user (fig. 2), the force sensor 20 may indicate the amount of force applied to the nose 40 by the lens assembly 70. Any desired number of force sensors 20 may be incorporated into the nasal flap 29 in this manner.
Alternatively or in addition, the fabric 31 may be otherwise incorporated into the device 10. For example, the fabric 31 may form a curtain fabric that is incorporated into the device 10 between the support structure 26-2 and the lens assemblies 70 (FIG. 2) to conceal internal components. At least a portion of the curtain fabric with the force sensor 20 may be located between the lens assembly 70 and the nose 40 (e.g., between the lens assembly 70 and the nasal flap 29), and thus the force applied to the nose 40 by the lens assembly 70 may be measured. Generally, however, the fabric 31 may be incorporated into the device 10 in any desired manner. Any desired number of force sensors 20 may be incorporated into the fabric 31.
In other examples, the force sensor 20 may be incorporated into the interior region of the nasal flap 29. An example of an arrangement in which the force sensor is located in the interior of the nasal flap 29 is shown in fig. 4C.
As shown in fig. 4C, the nasal flap 29 may have a peripheral region 29A surrounding an interior region 29B (also referred to herein as a chamber). Peripheral region 29A may be formed of a polymer, rubber, or any other desired material. The interior region 29B may be filled with air, gas, liquid, or any other desired substance. Force sensor 20 may be located in interior region 29B and may generate a force measurement in response to an increase in pressure in interior region 29B. For example, as the lens assembly 70 presses the nasal flap 29 against the nose 40 (FIG. 2), the pressure of air, gas, liquid, or other material within the nasal flap 29 may increase, thereby increasing the pressure on the force sensor 20. In some examples, force sensor 20 may be an atmospheric pressure sensor that measures the increased pressure within interior region 29B as lens assembly 70 moves against nose 40. Accordingly, force sensor 20 may generate a force measurement indicative of the force applied by lens assembly 70 to the nose of the user.
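For illustration only, the chamber-pressure scheme described above can be sketched as a pressure-rise-times-area estimate: a barometric sensor sealed in interior region 29B reports rising pressure as the flap is compressed. The at-rest pressure and contact area below are hypothetical constants.

```python
# Hypothetical readout for a pressure sensor sealed in the nasal-flap chamber:
# force on the nose is approximated by the pressure rise above the at-rest
# reading multiplied by an assumed flap-to-nose contact area.

def force_from_chamber_pressure(p_pa, p_rest_pa=101325.0, contact_area_m2=4e-4):
    """Force (newtons) ~= pressure rise (pascals) * contact area (m^2)."""
    return max(0.0, p_pa - p_rest_pa) * contact_area_m2
```

Calibrating `p_rest_pa` each time the device is donned would compensate for ambient pressure changes.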
Although fig. 4C shows one side or portion of the nasal flap 29, this is merely illustrative. Any desired number of force sensors 20 may be incorporated into the interior region 29B of the nasal flap 29 on one or both sides of the user's nose. Additionally, while fig. 4C shows the force sensor 20 implemented as a pressure sensor in the nasal flap 29, the force sensor 20 may be a pressure sensor anywhere between the nose 40 and the lens assembly 70 when worn by a user. For example, if desired, force sensor 20 may be implemented as a pressure sensor within an edge portion of lens assembly 70 (i.e., in location 20' "of FIG. 3).
As an alternative to the sensor shown in fig. 4A-4C, the force sensor 20 may be implemented to measure deflection of the nasal flap 29 relative to the lens assembly 70. An example of this type of force sensor is shown in fig. 5.
As shown in fig. 5, force sensor 20 may be mounted to (or within) a portion of lens assembly 70. The force sensor 20 may be a sensor that measures the proximity of the nasal flap 29, such as a capacitive proximity sensor, a resistive proximity sensor, an optical proximity sensor, an ultrasonic proximity sensor, or any other desired type of proximity sensor. Alternatively, the magnet 32 may be embedded in (or mounted on) the nasal flap 29, and the force sensor 20 may be implemented as a Hall effect sensor. In this way, the Hall effect sensor can determine the proximity of the magnet 32 and thus of the nasal flap 29. By measuring the proximity of the nasal flap 29 (i.e., the amount by which the nasal flap 29 has moved), the proximity sensor may provide an output indicative of the amount of force applied to the user's nose by the lens assembly 70.
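For illustration only, the Hall-effect variant described above can be sketched by inverting the dipole far-field falloff (field strength proportional to 1/d³) to recover the magnet distance, then converting flap deflection into force with an assumed flap stiffness. The calibration point and stiffness below are hypothetical.

```python
# Hypothetical Hall-effect readout: infer magnet distance from field strength,
# then treat the flap as a linear spring to estimate force on the nose.

def magnet_distance_from_field(b_tesla, b_ref_tesla=1e-3, d_ref_m=5e-3):
    """Invert B ~ 1/d^3 using a single calibration point (b_ref at d_ref)."""
    return d_ref_m * (b_ref_tesla / b_tesla) ** (1.0 / 3.0)

def force_from_hall_reading(b_tesla, rest_distance_m=5e-3, stiffness_n_per_m=300.0):
    """Deflection is how far the magnet has moved in from its rest distance."""
    deflection = rest_distance_m - magnet_distance_from_field(b_tesla)
    return stiffness_n_per_m * max(0.0, deflection)
```

A capacitive or optical proximity sensor would use the same deflection-to-force step with a different distance model.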
While the previous embodiments have included a dedicated force sensor 20 to ensure that excessive force is not applied to the user's nose, this is merely illustrative. The device 10 may use other sensors, such as the sensor 16, to determine whether the force applied to the nose 40 exceeds a threshold. An example of such an arrangement is shown in fig. 6.
As shown in fig. 6, one or more flexible members 34 may extend from the lens assembly 70. The flexible member 34 may be formed of rubber, polymer, fabric, or any other desired flexible material. The device 10 may also include a light emitting component 36 and a light detecting component 42, which may be used for gaze tracking or other desired operations. For example, the light emitting component 36 may be an infrared light emitting component, and the light detecting component 42 may be an infrared light detecting component. To determine the gaze of the user of the device 10, the infrared light emitting component 36 may emit light toward the user's eyes, and the reflections from the user's eyes may be detected by the infrared light detecting component 42. These reflections may indicate the direction of the user's gaze. In general, however, the light emitting component 36 and the light detecting component 42 may be any desired components and may operate at any desired wavelengths.
As the assembly 70 moves toward the nasal flap 29 (and thus toward the nose 40), it will eventually contact the nasal flap 29 and press it against the nose 40, thereby moving the flexible member 34 in the direction 38. If the flexible member 34 moves far enough (i.e., if the amount of force applied to the nose 40 reaches or exceeds a threshold value), the flexible member 34 may block the light emitting component 36. Thus, the light detecting component 42 may stop detecting the light emitted by the light emitting component 36. Based on this change in the signal from the light detecting component 42, the control circuit 12 may prevent the positioner 58 from moving the lens assembly 70 further toward the nose of the user.
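For illustration only, the flexible member and light path described above behave like a beam-break interlock: the positioner steps inward only while the detector still sees the emitter's beam. The interface and step size below are hypothetical.

```python
# Hypothetical beam-break interlock: a blocked beam means the force threshold
# on the nose has been reached, so inward motion stops.

def step_positioner(spacing_mm, detector_sees_beam, step_mm=0.1):
    """Return (new spacing, moved?). Step inward only while the beam is clear."""
    if not detector_sees_beam:
        return spacing_mm, False   # flexible member blocks the beam: halt
    return spacing_mm - step_mm, True
```

This yields a purely binary stop condition, in contrast to the continuous force estimates of the earlier sensor arrangements.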
Although the flexible member 34 is shown as covering the light emitting component 36, this is merely illustrative. The flexible member 34 may instead cover the light detecting component 42, or may move only into the space between the light emitting component 36 and the light detecting component 42. In addition, any desired number of flexible members 34 may be used.
In some embodiments, the light detection component 42 may be used without an intervening flexible member 34 to determine the position of the user's nose. For example, the light detection component 42 may be a camera (e.g., a camera sensitive to visible light) that determines how close the lens assembly 70 is to the nose 40. Control circuitry such as control circuitry 12 (fig. 1) may process images or video captured by the camera 42 to determine the position of the lens assembly 70 relative to the nose 40. If the lens assembly 70 is within a threshold distance of the nose 40 or the nose 40 has been compressed by a threshold amount, the control circuit 12 may prevent the positioner 58 from moving the lens assembly 70 closer to the nose 40.
Alternatively or in addition, a three-dimensional scan of the face of the user of the device 10 may be performed to ensure a proper fit. The three-dimensional scan may be performed by a three-dimensional sensor (such as a camera, infrared dot projector, infrared sensor, etc.) in the device 10 or external to the device 10. The three-dimensional scan may be used to capture the topography of the user's face. The control circuit 12 may then use the topography of the user's face to ensure that the positioner 58 does not move the lens assembly 70 too close to the nose 40 or apply too much force to the nose 40. For example, the control circuit 12 may use the topographical information to calculate a maximum distance that the lens assembly may travel without applying too much force to the nose 40, and limit movement to that maximum distance.
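For illustration only, the maximum-travel calculation described above can be sketched from two quantities a face scan would supply: the lens assembly's starting offset from the face midline and the nose half-width at lens height. The clearance value and all inputs below are hypothetical.

```python
# Hypothetical travel limit derived from scanned face topography: allowed
# inward travel is the starting offset minus the nose half-width and a
# comfort clearance, clamped at zero.

def max_inward_travel_mm(start_offset_mm, nose_half_width_mm, clearance_mm=1.0):
    """Maximum distance the lens assembly may move toward the midline (>= 0)."""
    return max(0.0, start_offset_mm - nose_half_width_mm - clearance_mm)
```

Because the limit is computed before any motion occurs, it can serve as a safety bound even when no force sensor is present.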
Alternatively or in addition, the device 10 may include a manual button that the user may press to indicate discomfort due to excessive force on the nose 40. In response to detecting a manual button press, the positioner 58 may cease moving the lens assembly 70 toward the nose 40. If desired, the positioner 58 can also move the lens assembly 70 back away from the nose 40 in response to a manual button press and/or the detection of excessive force. For example, the positioner 58 can move the lens module 70 away from the nose 40 by a desired gap (e.g., a gap G of at least 0.1mm, at least 0.2mm, at least 1mm, at least 2mm, less than 5mm, or other suitable spacing).
Feedback from the motor in the positioner 58 can be used, if desired, to measure the position of the lens module 70 relative to the opposing surface of the nose 40 as the lens module 70 is moved into contact with that surface. Fig. 7 shows exemplary control circuitry for a positioner such as positioner 58. The control circuit 12 (fig. 1) may include a motor controller such as controller 80. The controller 80 may drive the motor 86 in the positioner 58 by providing a supply voltage Vin to the motor 86 over the path 84 to move the associated lens module 70. When the voltage Vin is provided to the motor 86, the controller 80 of the control circuit 12 uses the sensor circuit 82 (e.g., a current-sense resistor with corresponding analog-to-digital converter circuitry, etc.) to monitor the resulting current I through the path 84. The supply voltage Vin may remain relatively constant as the motor 86 moves the lens assembly 70. The positioner 58 may initially be used to position the edge of the lens assembly 70 at a location away from the nose 40. The control circuit 12 may then direct the positioner 58 to move the lens assembly 70 toward the nose 40. The controller 80 of the control circuit 12 may monitor the current I flowing through the path 84 as sensed by the sensor circuit 82. As the lens assembly 70 is pressed against the side of the nose 40, the current I will increase. When the current I exceeds a desired threshold (which is related to the force applied to the nose 40), the control circuit 12 may prevent the positioner 58 from applying further force to the nose 40 with the lens assembly 70.
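For illustration only, the fig. 7 scheme reduces to a current-threshold decision: with Vin held constant, motor current rises as the lens assembly loads against the nose, so a current limit doubles as a force limit. The current samples and limit below are hypothetical values, not figures from this disclosure.

```python
# Hypothetical stall-current check: the controller keeps stepping the motor
# only while the sensed current I stays below a limit tied to nose force.

def keep_driving(current_a, current_limit_a=0.25):
    """Controller decision: continue motion only while current is below the limit."""
    return current_a < current_limit_a

# Hypothetical current samples as the lens assembly contacts the nose.
samples = [0.08, 0.09, 0.10, 0.18, 0.27]
steps_taken = 0
for i_amps in samples:
    if not keep_driving(i_amps):
        break          # threshold crossed: stop pressing on the nose
    steps_taken += 1
```

The appeal of this approach is that no dedicated force sensor is needed; the drive electronics already carry the signal.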
Fig. 8 shows exemplary operations involved in operating the device 10 in the system 8.
During the operations of block 100, information about the distance between the user's eyes (interpupillary distance IPD, sometimes referred to as pupillary distance) may be collected. In one exemplary arrangement, the device 10 or other equipment in the system 8 gathers the user's interpupillary distance by prompting the user to type the interpupillary distance into a data entry box on the display 14 or on a display in the other equipment in the system 8. The user may also use voice input or other user input arrangements to provide the user's interpupillary distance. In another exemplary arrangement, a sensor in the device 10, or a sensor in a separate computer, portable device, or other equipment in the system 8, may measure the user's interpupillary distance. For example, a sensor such as a two-dimensional or three-dimensional image sensor may collect images of the user's face to measure the value of the interpupillary distance IPD. After the interpupillary distance has been measured, it may be provided to the device 10 (e.g., via a wired or wireless communication path). If desired, a gaze tracker may measure the positions of the centers PC of the user's eyes, thereby determining the IPD by direct measurement while the user wears the device 10 on the user's head.
After the inter-pupillary distance IPD has been gathered, the control circuitry 12 of the device 10 may use the positioners 58 to adjust the lens-to-lens spacing between the lens centers LC during the operations of block 102, so that this spacing matches the inter-pupillary distance IPD and so that the centers of the lenses 72 are aligned with the respective eye centers PC. While the positioners 58 are moving the lens modules 70 and the lenses 72 (e.g., when the lens-to-lens spacing is being reduced and the modules 70 are moving toward adjacent surfaces of the user's nose), the control circuitry 12 uses a force sensing circuit (e.g., the force sensor 20) to monitor the force exerted by the lens modules 70 on the nose 40. In some cases, the user's nose 40 may prevent the lenses 72 from moving sufficiently close to each other to allow the lens-to-lens spacing to accurately match the IPD without risking discomfort to the user.
In other words, the force sensor 20 may indicate that the force applied by the lens module 70 to the nose 40 is excessive, or that the force applied by the lens module 70 to the nose 40 has reached a threshold. Alternatively or in addition, the device 10 may include a manual button that the user may press to indicate discomfort due to excessive force on the nose 40. In response to excessive force measured by the force sensor 20 or in response to the user pressing the button, the control circuit 12 may prevent the positioner 58 from moving the lens module 70 further toward the nose 40. If desired, the positioner 58 can move the lens module 70 away from the nose 40 by a desired gap (e.g., a gap G of at least 0.1 mm, at least 0.2 mm, at least 1 mm, at least 2 mm, less than 5 mm, or other suitable spacing).
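The sequence of block 102 (reduce the lens-to-lens spacing toward the measured IPD, stop when the force threshold is reached, and retreat by the gap G) can be modeled as follows. This is an illustrative sketch; the function name, the step size, the 1 N limit, and the force model supplied by the caller are all assumptions, not values from this disclosure.

```python
def adjust_lens_spacing(start_mm, target_mm, force_at, limit_n=1.0,
                        step_mm=0.1, gap_mm=0.2):
    """Reduce lens-to-lens spacing toward target_mm (the measured IPD).

    force_at(spacing_mm) models force sensor 20: the force the lens
    modules would apply to the nose at a given spacing.  If the next
    step would exceed limit_n, stop and retreat away from the nose by
    gap_mm (the gap G).  Returns the final spacing in mm.
    """
    spacing = start_mm
    while spacing > target_mm:
        next_spacing = max(target_mm, spacing - step_mm)
        if force_at(next_spacing) > limit_n:
            return spacing + gap_mm  # back the modules off the nose
        spacing = next_spacing
    return spacing  # spacing matches the IPD with no excess force
```

With a force model representing a wider nose, the function stops short of the target and returns the last safe spacing plus the gap; with no nose contact, it returns a spacing equal to the IPD.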
After positioning the module 70 at a desired position relative to the nose 40 to ensure comfort to the user while wearing the device 10, the control circuit 12 may use the display 14 to present visual content to the user through the lens 72 (block 104).
Alternatively or in addition to mounting the force sensor on the nasal flap, the force sensor may be coupled to a lens module. An illustrative example of a force sensor coupled to a lens module is shown in fig. 9.
As shown in fig. 9, the lens module 70 may include an optical module 69 (also referred to herein as an optical portion) surrounded by a trim ring 90. The trim ring 90 may be formed of metal, plastic, or another material and surrounds the optical module 69. In addition, the lens module 70 may have an intermediate trim 71 between the optical portion 69 and the trim ring 90. To mount the force sensor to the lens module 70, blades 92A to 92D may be mounted between the intermediate trim 71 and the trim ring 90, while blades 94A to 94D may be mounted between the optical portion 69 and the intermediate trim 71. In particular, the flexible structures 92 and 94 (also referred to herein as blades or blade structures) may be formed of metal, plastic, other polymers, or other desired materials. In one illustrative example, the flexible structures 92 and 94 may be formed from a flexible sheet of metal, such as steel or another metal having a low thickness (such as less than 300 microns, less than 260 microns, less than 250 microns, or other suitable thickness).
The blades 92 may extend in the x-direction between the intermediate trim 71 and the trim ring 90. The blades 94 may extend in the y-direction between the optical portion 69 and the intermediate trim 71. To measure the force applied to the lens module 70, such as when the lens module 70 contacts the user's nose during an IPD-based adjustment, strain gauges 96A and 96B may be coupled to the blades 92 and 94. In the example of fig. 9, the strain gauge 96A is coupled to the blade 92B and the strain gauge 96B is coupled to the blade 94B. However, this arrangement is merely illustrative. In general, a strain gauge 96 may be coupled to any of the flexible structures 92 and 94. Furthermore, more than two strain gauges 96 may be used if desired. In some embodiments, all of the flexible structures 92 and 94 may have an associated strain gauge 96. Alternatively, only one of the flexible structures 92 or 94 may have a strain gauge 96 if desired.
When the lens module 70 contacts the user's nose, the trim ring 90 and/or the intermediate trim 71 may deflect, causing deflection of the blades 92 and/or the blades 94. The strain gauges 96 may measure these deflections, which may be proportional to the force applied to the nose. In this way, the strain gauges 96 may monitor the force applied to the nose by the lens module 70. If the force is too great, a control circuit such as the control circuit 12 may prevent the lens module from moving further toward the user's nose and, if desired, may reverse the lens module away from the user's nose by a set distance. In this manner, strain gauges on the blades 92 and/or 94 may monitor the force applied to the user's nose to avoid excessive pressure on the nose.
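As a sketch of how such a strain-gauge readout might be converted to a force and compared against a limit, the following assumes a quarter Wheatstone bridge and a blade whose strain is proportional to the applied force; the gauge factor, excitation voltage, stiffness, and threshold values are illustrative assumptions, not taken from this disclosure.

```python
GAUGE_FACTOR = 2.0  # typical of metal-foil strain gauges (illustrative)


def strain_from_quarter_bridge(v_out, v_excitation):
    """Small-strain approximation for a quarter Wheatstone bridge:
    v_out / v_excitation ~= GAUGE_FACTOR * strain / 4."""
    return 4.0 * v_out / (v_excitation * GAUGE_FACTOR)


def force_from_blade(v_out, v_excitation, n_per_unit_strain):
    """Convert bridge output to force on the blade, assuming the
    blade's deflection (hence strain) is proportional to the applied
    force, as described for the blades 92 and 94."""
    return strain_from_quarter_bridge(v_out, v_excitation) * n_per_unit_strain


def module_presses_too_hard(v_out, v_excitation=5.0,
                            n_per_unit_strain=500.0, limit_n=1.0):
    """True when the inferred nose force exceeds the comfort limit."""
    return force_from_blade(v_out, v_excitation, n_per_unit_strain) > limit_n
```

A control loop would poll the bridge output and halt (or reverse) the positioner whenever `module_presses_too_hard` returns True.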
Although fig. 9 shows the flexible structures 92 between the intermediate trim 71 and the trim ring 90 and the flexible structures 94 between the optical portion 69 and the intermediate trim 71, this is merely illustrative. If desired, the blades 92 may extend between the intermediate trim 71 and the optical portion 69, and the blades 94 may extend between the intermediate trim 71 and the trim ring 90. Regardless of the arrangement of the blades 92 and 94, strain gauges 96 may be coupled to the blades to detect whether the lens module 70 is applying excessive pressure to the nose.
Although fig. 9 shows the blades 92 and 94 extending in the x- and y-directions, this is merely illustrative. In general, the blades 92 and 94 may extend in any desired direction, including the z-direction or an intermediate direction between the x-, y-, and/or z-directions. Further, although fig. 9 shows the blades 92 and 94 between the optical portion 69, the intermediate trim 71, and the trim ring 90, this is merely illustrative. In some embodiments, the intermediate trim 71 may be omitted, and the blades 92 and 94 may extend directly between the optical portion 69 and the trim ring 90.
Alternatively or in addition to having strain gauges within the lens module 70, strain gauges may be coupled to an attachment mechanism between the lens module 70 and the head-mounted housing 26. An illustrative example of such an arrangement is shown in fig. 10.
As shown in fig. 10, the device 10 may include an attachment structure 98 (also referred to herein as a coupler 98) that may couple the lens module 70 to a head-mounted housing, such as the housing 26 of fig. 2. Coupler 98 may be attached to housing 26 with an accessory 106. In some embodiments, the accessory 106 may include pins, rods, or other mechanisms that allow the lens module 70 to slidably move relative to the head-mounted housing 26. In general, however, accessory 106 may include any desired component for attaching lens module 70 to housing 26.
Coupler 98 may also include flexible structures 108 (also referred to herein as blades 108) extending between the lens module 70 and the accessory 106. The flexible structures 108 may be formed of metal, plastic, or another material that flexes when a force is applied to the lens module 70 (e.g., because the accessory 106 remains stationary). For example, the blades 108 may be formed of steel or another metal having a small thickness (such as less than 300 microns, less than 260 microns, less than 250 microns, or other suitable thickness).
A strain gauge 110 may be mounted on at least one of the blades 108. The strain gauge 110 may measure the strain of the blade 108 when a force is applied to the lens module 70; this strain is proportional to the force on the lens module 70. In this way, the force exerted by the lens module 70 on the nose can be monitored, and if excessive force is applied, further compression of the nose by the lens module 70 can be prevented.
While strain gauge 110 is shown on only one of blades 108, this is merely illustrative. In general, the strain gauge 110 may be applied to any or all of the blades 108.
Instead of mounting the strain gauge 110 on the blade 108 in the coupler 98, the coupler 98 may include a spring structure and a load cell or position sensor to determine the amount of force applied to the nose by the module 70. An exemplary arrangement of springs and force sensing components is shown in fig. 11A-11D.
As shown in fig. 11A, the coupler 98 may include a structure 112 that may be coupled to the top of the module 70 and an extension spring 116 coupled to the structure 112. Specifically, the extension spring 116 may extend from a component 118 to the load cell 114; the component 118 may correspond to the accessory 106 of fig. 10 (e.g., an accessory between the coupler 98 and the head-mounted housing). The load cell 114 may monitor the force exerted by the spring 116. Because the spring 116 is coupled to the component 118 and/or the structure 112 (which is coupled to the optical module), the load cell 114 may be triggered when a force exceeding a desired threshold (determined by the spring 116) is applied to the nose. Specifically, the measured force is proportional to the force exerted by the module 70 on the nose. In this manner, the output of the load cell 114 may be monitored, and if excessive force is detected (e.g., when the load cell 114 is triggered), the module 70 may be prevented from applying additional pressure to the nose.
Instead of using an extension spring as in fig. 11A, a compression spring 120 (fig. 11B) may be used or a double spring with segments 122 and 124 (fig. 11C) may be used. Regardless of the type of spring used, the output of the load cell 114 may be monitored and if excessive force is detected, the module 70 may be prevented from applying additional pressure to the nose.
As another example, a spring and an interferometer may be used, as shown in fig. 11D. As shown in fig. 11D, the coupler 98 may include a spring 128 coupled to the component 118. The interferometer 126 can measure the force applied to the nose by the module 70 by monitoring the position of the component 118.
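All of the spring-based arrangements of figs. 11A-11D ultimately infer force from a spring of known stiffness. Under that assumption (Hooke's law, with a hypothetical spring constant), a displacement reading such as that of the interferometer 126 converts to force as follows:

```python
def force_from_spring_displacement(displacement_mm, k_n_per_mm=0.5):
    """Hooke's law: the force transmitted through spring 128 equals the
    spring constant times the measured displacement of component 118
    (as reported, e.g., by interferometer 126).  The default spring
    constant here is an illustrative value, not from the source."""
    return k_n_per_mm * displacement_mm
```

For example, a 2 mm displacement at 0.5 N/mm implies 1 N of force on the nose, which could then be compared against the same comfort threshold used elsewhere in the device.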
A physical environment refers to a physical world that people can sense and/or interact with without the aid of electronic devices. The physical environment may include physical features, such as physical surfaces or physical objects. For example, the physical environment corresponds to a physical park that includes physical trees, physical buildings, and physical people. People can directly sense and/or interact with the physical environment, such as through sight, touch, hearing, taste, and smell. In contrast, an extended reality (XR) environment refers to a wholly or partially simulated environment that people sense and/or interact with via an electronic device. For example, the XR environment may include augmented reality (AR) content, mixed reality (MR) content, virtual reality (VR) content, and the like. With an XR system, a subset of a person's physical motions, or representations thereof, are tracked, and in response, one or more characteristics of one or more virtual objects simulated in the XR environment are adjusted in a manner that comports with at least one law of physics. As one example, the XR system may detect head movement and, in response, adjust the graphical content and sound field presented to the person in a manner similar to how such views and sounds would change in a physical environment. As another example, the XR system may detect movement of an electronic device (e.g., a mobile phone, tablet computer, laptop computer, etc.) presenting the XR environment and, in response, adjust the graphical content and sound field presented to the person in a manner similar to how such views and sounds would change in a physical environment. In some situations (e.g., for accessibility reasons), the XR system may adjust characteristics of graphical content in the XR environment in response to representations of physical motions (e.g., voice commands).
There are many different types of electronic systems that enable a person to sense and/or interact with various XR environments. Examples include wearable systems, projection-based systems, head-up displays (HUDs), vehicle windshields integrated with display capabilities, windows integrated with display capabilities, displays formed as lenses designed for placement on a person's eyes (e.g., similar to contact lenses), headphones/earphones, speaker arrays, input systems (e.g., wearable or handheld controllers with or without haptic feedback), smartphones, tablet computers, and desktop/laptop computers. The head-mountable system may have one or more speakers and an integrated opaque display. Alternatively, the head-mountable system could be configured to accept an external opaque display (e.g., a smart phone). The head-mountable system could incorporate one or more imaging sensors for capturing images or video of the physical environment and/or one or more microphones for capturing audio of the physical environment. The head-mountable system could have a transparent or translucent display instead of an opaque display. A transparent or translucent display may have a medium through which light representing an image is directed to a person's eye. The display may utilize digital light projection, OLED, LED, uLED, liquid crystal on silicon, laser scanning light sources, or any combination of these techniques. The medium may be an optical waveguide, a holographic medium, an optical combiner, an optical reflector, or any combination thereof. In some implementations, the transparent or translucent display may be configured to selectively become opaque. Projection-based systems may employ retinal projection techniques that project a graphical image onto a person's retina. The projection system may also be configured to project the virtual object into the physical environment, for example as a hologram or on a physical surface.
As described above, one aspect of the present technology is the gathering and use of information such as sensor information. The present disclosure contemplates that, in some instances, this gathered data may include personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include demographic data, location-based data, telephone numbers, email addresses, Twitter IDs, home addresses, data or records relating to a user's health or level of fitness (e.g., vital signs measurements, medication information, exercise information), date of birth, username, password, biometric information, or any other identifying or personal information.
The present disclosure recognizes that the use of such personal information in the present technology may be used to benefit users. For example, the personal information data may be used to deliver targeted content of greater interest to the user. Thus, the use of such personal information data enables a user to have control over the delivered content. In addition, the present disclosure contemplates other uses for personal information data that are beneficial to the user. For example, health and fitness data may be used to provide insight into the general health of a user, or may be used as positive feedback to individuals who use technology to pursue health goals.
The present disclosure contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure. Such policies should be easily accessible by users and should be updated as the collection and/or use of data changes. Personal information from users should be collected for legitimate and reasonable uses and not shared or sold outside of those legitimate uses. Further, such collection/sharing should occur after receiving the informed consent of the users. Additionally, such entities should consider taking any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices. In addition, policies and practices should be adapted for the particular types of personal information data being collected and/or accessed and adapted to applicable laws and standards, including jurisdiction-specific considerations. For instance, in the United States, collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (HIPAA), whereas health data in other countries may be subject to other regulations and policies and should be handled accordingly. Hence, different privacy practices should be maintained for different personal data types in each country.
Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, the present technology can be configured to allow users to select to "opt in" or "opt out" of participation in the collection of personal information data during registration for services or at any time thereafter. In another example, users can select not to provide certain types of user data. In yet another example, users can select to limit the length of time user-specific data is maintained. In addition to providing "opt in" and "opt out" options, the present disclosure contemplates providing notifications relating to the access or use of personal information. For instance, a user may be notified upon downloading an application ("app") that their personal information data will be accessed and then reminded again just before the personal information data is accessed by the app.
Moreover, it is the intent of the present disclosure that personal information data should be managed and handled in a way that minimizes the risks of unintentional or unauthorized access or use. Risk can be minimized by limiting the collection of data and deleting data once it is no longer needed. In addition, and when applicable, including in certain health-related applications, data de-identification can be used to protect a user's privacy. De-identification may be facilitated, when appropriate, by removing specific identifiers (e.g., date of birth, etc.), controlling the amount or specificity of data stored (e.g., collecting location data at a city level rather than at an address level), controlling how data is stored (e.g., aggregating data across users), and/or other methods.
Therefore, although the present disclosure broadly contemplates the use of information that may include personal information data to implement one or more of the various disclosed embodiments, the present disclosure also contemplates that the various embodiments can be implemented without the need for accessing such personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data.
According to an embodiment, a head-mounted device is provided that includes a display having a first display portion and a second display portion, control circuitry configured to supply content using the display, a first lens assembly and a second lens assembly, the first lens assembly including the first display portion and the second lens assembly including the second display portion, positioning circuitry configured to adjust a lens-to-lens spacing between the first lens assembly and the second lens assembly, and sensor circuitry configured to collect force sensor information in response to the first lens assembly and the second lens assembly applying a force to a nose.
According to another embodiment, the head-mounted device includes a nasal flap configured to separate the first and second lens assemblies from the nose.
According to another embodiment, the sensor circuit includes a first sensor interposed between the nasal flap and the first lens assembly and a second sensor interposed between the nasal flap and the second lens assembly.
According to another embodiment, the first sensor and the second sensor are direct force sensors.
According to another embodiment, the direct force sensor includes a resistor having interdigitated traces.
According to another embodiment, the first sensor and the second sensor are proximity sensors configured to measure the proximity of the nasal flap to the first lens assembly and the second lens assembly.
According to another embodiment, the head-mounted device includes a magnet located in the nasal flap, and the first sensor and the second sensor are Hall effect sensors configured to measure proximity of the magnet.
According to another embodiment, the sensor circuit includes a camera configured to capture an image, and the control circuit is configured to determine the position of the first lens assembly and the second lens assembly relative to the nose based on the image.
According to another embodiment, the positioning circuit is configured to stop moving the first and second lens assemblies toward the nose in response to the control circuit determining that the distance between the first and second lens assemblies and the nose is less than a threshold.
According to another embodiment, the control circuit is configured to determine a maximum distance the first and second lens assemblies can move toward the nose based on a three-dimensional facial scan, and the positioning circuit is configured to stop moving the first and second lens assemblies toward the nose after moving the maximum distance.
According to another embodiment, the nasal flap comprises a material surrounding a chamber, and the sensor circuit comprises a pressure sensor within the chamber.
According to another embodiment, the pressure sensor is an atmospheric pressure sensor.
According to another embodiment, the sensor circuit includes a light-emitting component and a light-detecting component, and the head-mounted device further includes a flexible member extending from the first and second lens assemblies, the flexible member configured to move between the light-emitting component and the light-detecting component when the force exceeds a threshold.
According to another embodiment, the positioning circuit comprises a motor and the sensor circuit comprises a sensor that measures the voltage and current of the motor to determine the force.
According to another embodiment, the positioning circuit is configured to move the first and second lens assemblies toward the nose and is configured to stop moving the first and second lens assemblies toward the nose in response to the force exceeding a threshold.
According to another embodiment, the positioning circuit is further configured to move the first and second lens assemblies away from the nose by a gap in response to the force exceeding the threshold.
According to another embodiment, the sensor circuit is coupled to the first lens assembly and the second lens assembly and configured to collect nose contact force information.
According to another embodiment, the headset includes a flexible structure coupled to the first lens assembly, and the sensor circuit includes a strain gauge coupled to the flexible structure.
According to another embodiment, the lens-to-lens spacing is configured to be adjusted in a first direction and the flexible structure extends in a second direction perpendicular to the first direction.
According to another embodiment, the flexible structure is a first flexible structure and the strain gauge is a first strain gauge, the headset further comprising a second flexible structure extending along the first direction, the sensor circuit further comprising a second strain gauge coupled to the second flexible structure.
According to another embodiment, the first and second lens assemblies include first and second trim pieces, respectively, the first and second flexible structures extending from an optical portion of the first lens assembly to the first trim piece.
According to another embodiment, the headset includes a third flexible structure extending in a third direction different from the first direction and the second direction, the sensor circuit further including a third strain gauge coupled to the third flexible structure.
According to another embodiment, the head-mounted device includes a head-mounted housing, a first coupler and a second coupler, the first and second couplers attach the first and second lens assemblies to the head-mounted housing, respectively, and the sensor circuit is coupled to the first and second couplers.
According to another embodiment, the headset includes a flexible structure coupled to the first coupler and the second coupler, and the sensor circuit includes a strain gauge coupled to the flexible structure.
According to another embodiment, the lens-to-lens spacing is configured to be adjusted in a first direction and the flexible structure extends in a second direction perpendicular to the first direction.
According to another embodiment, the head-mounted device includes a spring coupled to the first coupler and the second coupler, and a load cell coupled to the spring, the load cell configured to measure the force.
According to another embodiment, the spring is a tension spring.
According to another embodiment, the spring is a compression spring.
According to a further embodiment, the spring comprises a double spring coupled to each of the first coupler and the second coupler.
According to another embodiment, the headset includes a spring coupled to the first and second couplers, and a sensor coupled to the first and second couplers, the sensor configured to determine a distance between the sensor and the spring to determine the force.
According to an embodiment, a head-mounted device is provided that includes first and second pixel arrays configured to display content, left and right positioners, left and right lens assemblies positioned by the left and right positioners, respectively, the left lens assembly including a left lens and the first pixel array and the right lens assembly including a right lens and the second pixel array, a nasal flap configured to separate a nose from the left and right lens assemblies, a left force sensor adjacent to the left lens assembly, a right force sensor adjacent to the right lens assembly, and control circuitry configured to position the left and right lens assemblies using the left and right positioners based on information from the left and right force sensors.
According to another embodiment, the left force sensor and the right force sensor are interposed between the nasal flap and the left lens assembly and between the nasal flap and the right lens assembly, respectively.
According to another embodiment, the left force sensor and the right force sensor are integrated into the nasal flap.
According to another embodiment, the left force sensor and the right force sensor are interposed between the nasal flap and the nose.
According to an embodiment, a head-mounted device is provided that includes a display, a lens assembly including a portion of the display, the lens assembly including an optical portion and a trim ring extending around the optical portion, a plurality of blades coupled to the optical portion and the trim ring, a positioning circuit configured to move the lens assembly, and a strain gauge located on the plurality of blades, the strain gauge configured to collect force sensor measurements.
According to another embodiment, the lens assembly further comprises an intermediate trim piece positioned between the optical portion and the trim ring, a first blade of the plurality of blades being coupled to the optical portion and the intermediate trim piece, and a second blade of the plurality of blades being coupled to the intermediate trim piece and the trim ring.
According to another embodiment, a first blade of the plurality of blades extends in a first direction, a second blade of the plurality of blades extends in a second direction perpendicular to the first direction, at least one of the strain gauges is located on one of the first blades, and at least another one of the strain gauges is located on one of the second blades.
According to another embodiment, each of the plurality of blades is coupled to an associated one of the strain gauges.
The foregoing is merely illustrative and various modifications may be made to the embodiments. The foregoing embodiments may be implemented alone or in any combination.