Detailed Description
Reference will now be made in detail to the exemplary embodiments illustrated in the drawings. It should be understood that the following description is not intended to limit the embodiments to one preferred embodiment. On the contrary, it is intended to cover alternatives, modifications and equivalents as may be included within the spirit and scope of the embodiments as defined by the appended claims.
The following description relates to the configuration and operation of an SMI-based gesture input system (i.e., a system that may use signals received from one or more SMI sensors to recognize gestures made by a user). The SMI sensor may be used to optically measure relative motion (displacement) between the SMI sensor and a target (e.g., a surface or object) using sub-wavelength resolution. When the displacement measurement is correlated with the measurement time, the velocity of the target may also be measured. Further, by modulating the SMI sensor with a known wavelength modulation (e.g., triangular modulation), the absolute distance from the SMI sensor to the target may be measured.
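As a rough illustration of the displacement and velocity measurements described above, the following sketch counts interference fringes, each of which corresponds to roughly half a wavelength of motion along the beam axis. The wavelength, function names, and sample numbers here are illustrative assumptions, not details of any particular embodiment.

```python
# Hypothetical sketch: converting SMI fringe counts to displacement and
# velocity. Each interference fringe corresponds to lambda/2 of motion
# along the beam axis. All names and values are illustrative only.

WAVELENGTH_M = 940e-9  # assumed VCSEL wavelength (a typical infrared value)

def displacement_from_fringes(fringe_count: int,
                              wavelength_m: float = WAVELENGTH_M) -> float:
    """Displacement along the beam axis, in meters (lambda/2 per fringe)."""
    return fringe_count * wavelength_m / 2.0

def velocity_from_fringes(fringe_count: int, interval_s: float,
                          wavelength_m: float = WAVELENGTH_M) -> float:
    """Average velocity over the measurement interval, in m/s."""
    return displacement_from_fringes(fringe_count, wavelength_m) / interval_s

# 1000 fringes observed over 10 ms: about 470 um of motion at 47 mm/s
d = displacement_from_fringes(1000)      # ~4.7e-4 m
v = velocity_from_fringes(1000, 10e-3)   # ~0.047 m/s
```

This also shows why correlating displacement with measurement time yields velocity: the fringe count fixes the displacement, and dividing by the observation interval gives the average speed along the beam.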
In Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR) applications, among others, it may be useful to track a user's finger movements and/or to recognize gestures of the user (e.g., gestures made with one or more fingers, hands, arms, etc.). In some applications, it may be useful for a user to be able to provide input to the system by interacting with a surface (e.g., making gestures on an arbitrary surface such as a desktop, wall, or paper) or by making gestures in free space. In such applications, the SMI-based gesture input system may be used to track the user's finger movements with reference to any surface (in some cases, including the surface of another finger, the palm of the user, etc.).
An SMI-based gesture input system and a device that may be worn or held by a user are described herein. Some of the systems include a single wearable device or a handheld device. Other systems may include two or more wearable devices and/or handheld devices. A system may be provided with more or fewer SMI sensors, with more sensors typically enabling finer-resolution tracking or more complex gesture detection/recognition. For example, scrolling along a single axis may be detected using one SMI sensor. With two SMI sensors, user motion in a plane may be tracked. With three or more SMI sensors, movement in the x, y, and z directions may be tracked. Motion with six degrees of freedom may also be tracked with three or more SMI sensors, in some cases by modulating the SMI sensors in a particular or different manner.
In contrast to conventional optical tracking methods such as optical flow and spot tracking, SMI-based tracking methods can reject ambient light (e.g., sunlight or other ambient light) and track motion with six degrees of freedom without the need for additional sensors for determining distance to the target surface. The SMI-based gesture input system may also be used in a darkened room (e.g., a room without ambient light).
These and other techniques are described with reference to FIGS. 1-14. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes only and should not be construed as limiting.
Directional terms such as "top", "bottom", "upper", "lower", "front", "rear", "above", "below", "left", "right" and the like are used with reference to the orientation of some of the components in some of the figures described below. Because components in various embodiments can be positioned in a number of different orientations, the directional terminology is used for purposes of illustration and is in no way limiting. Directional terminology is intended to be interpreted broadly and therefore should not be interpreted as excluding components that are oriented in a different manner. Use of alternative terms such as "or" is intended to mean different combinations of the alternative elements. For example, A or B is intended to include A or B, or A and B.
FIG. 1 illustrates an exemplary SMI-based gesture input system 100. The system 100 includes a device housing 102, a set of one or more SMI sensors 104 mounted within the device housing 102, a processing system 106 mounted within the device housing 102, and/or a communication interface 108 mounted within the device housing 102.
The device housing 102 may take various forms and, in some cases, may be configured to be worn or held by the user 110. When the device housing 102 is configured to be worn by the user 110, the device housing 102 may define a wearable device, such as a ring, whole or partial glove, sleeve, or the like. When the device housing 102 is configured to be held by the user 110, the device housing 102 may define a stylus, another writing instrument (e.g., a pen or pencil), any other object, and so forth. In any event, the device housing 102 may be made of a variety of materials (such as plastic, metal, or ceramic materials). In some cases, the device housing 102 may include multiple components, such as first and second rings that snap together or are otherwise joined (e.g., by adhesive or solder), first and second semicircular tubes that snap together or are otherwise joined (e.g., by adhesive or solder), or one or more pieces that define an open partial circle having one or more open ends that are plugged by a cap.
Each of the SMI sensors 104 may include a source of electromagnetic radiation. The electromagnetic radiation source may include a resonant cavity from which the electromagnetic radiation beam 112 is emitted. The electromagnetic radiation beam 112 may comprise a coherent (or partially coherent) mixture of 1) electromagnetic radiation generated by an electromagnetic radiation source, and 2) electromagnetic radiation received into a resonant cavity of the electromagnetic radiation source after reflection or backscatter from the surface 114. Each of the SMI sensors 104 may include a photodetector that generates an SMI signal 116 containing information about the relationship between the SMI sensor 104 and the surface 114. The SMI signal 116 generated by the SMI sensor 104 contains information corresponding to information contained in the electromagnetic radiation waveform received by the SMI sensor 104. Alternatively, SMI sensor 104 may output a measurement of the current or junction voltage of its electromagnetic radiation source as SMI signal 116.
The one or more SMI sensors 104 may emit a set of one or more beams of electromagnetic radiation 112. Different beams 112 may be emitted in different directions. In some cases, some or all of the beams 112 may be emitted in a direction extending away from a first surface of the user 110 (e.g., away from a surface of the user 110 on which the device housing 102 is worn). Some (or all) of the beams 112 may be emitted toward a second surface (e.g., surface 114). The SMI signals 116 generated by the set of one or more SMI sensors 104 may contain information not only about the relationship between the individual SMI sensors 104 and the surface 114, but also about the relationship between the device housing 102 and the surface 114, and thus about the position, orientation, or movement of the user 110 who is wearing or holding the device housing 102.
Processing system 106 may include, for example, one or more analog-to-digital converters (ADCs) 118 (e.g., one ADC 118 per SMI sensor 104), a processor 120, and/or other components for digitizing the SMI signals 116 output by the SMI sensors 104. In some cases, processing system 106 may include filters, amplifiers, or other discrete circuitry for processing the SMI signals 116. The processor 120 may take various forms, such as that of a microprocessor, a microcontroller, an application-specific integrated circuit (ASIC), or the like.
Processor 120 may be configured to extract the relationship between device housing 102 and surface 114 from digitized samples of the one or more SMI signals 116. When system 100 includes only one SMI sensor 104, or when processor 120 uses only one SMI signal 116, processor 120 may determine, for example, movement of device housing 102 along the axis of the beam 112 emitted by the SMI sensor (e.g., in the x, y, or z direction of a Cartesian coordinate system) (and thus movement of user 110). When system 100 includes only two SMI sensors 104, or when processor 120 uses only two SMI signals 116, processor 120 may determine, for example, movement of device housing 102 in a plane (e.g., in the xy, xz, or yz plane of a Cartesian coordinate system, assuming the beams 112 are tilted with respect to (i.e., neither perpendicular nor parallel to) the plane) (and thus movement of user 110). When system 100 includes at least three SMI sensors 104, or when processor 120 uses at least three SMI signals 116, processor 120 may determine, for example, movement of device housing 102 in free space (e.g., in the xyz space of a Cartesian coordinate system) (and thus movement of user 110).
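The way along-beam measurements may be combined into Cartesian motion can be sketched as follows. Each SMI sensor measures only the displacement projected onto its beam axis, so with three linearly independent beam directions the Cartesian displacement follows from a 3x3 linear system. The beam geometry, function name, and sample values below are illustrative assumptions.

```python
# Hypothetical sketch: recovering a 3D displacement from three along-beam
# displacement measurements. Row i of the system matrix is the unit vector
# of beam i; the right-hand side holds the measured along-beam values.

import numpy as np

def cartesian_displacement(beam_dirs, along_beam_disp):
    """Solve A @ d = m for the Cartesian displacement d (meters)."""
    A = np.array([np.asarray(v, float) / np.linalg.norm(v) for v in beam_dirs])
    return np.linalg.solve(A, np.asarray(along_beam_disp, dtype=float))

# Three orthogonal beams aligned with x, y, and z simply pass the
# measurements through unchanged:
d = cartesian_displacement([(1, 0, 0), (0, 1, 0), (0, 0, 1)],
                           [1e-3, 2e-3, 0.0])
```

With non-orthogonal (but still linearly independent) beams, the same solve applies; the matrix is simply no longer the identity, which is consistent with the orthogonality discussion that follows.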
When system 100 includes two or three SMI sensors 104, the beams 112 emitted by the SMI sensors 104 preferably have orthogonal axes, which decouples the SMI signals 116 to improve sensitivity and minimize errors, and also reduces the processing burden (i.e., computational burden) placed on the processor 120. However, the beams 112 need not have orthogonal axes if the angle between each beam 112 and the direction of the measured displacement is known. When system 100 generates more SMI signals 116 than are required by processor 120, or when system 100 includes more than three SMI sensors 104, processor 120 may analyze digitized samples of the plurality of SMI signals 116 and identify (based at least in part on the analysis) at least one SMI signal of the plurality of SMI signals 116 from which the relationship between device housing 102 and surface 114 is extracted. In the latter case, it is recognized that, in some cases, the device housing 102 may be positioned such that one or more of its SMI sensors 104 emit beams of electromagnetic radiation 112 in directions that are not useful, or in directions that result in different beams 112 impinging on different surfaces. Thus, the processor 120 may analyze digitized samples of the plurality of SMI signals 116 to determine which SMI signals 116 appear to contain useful information about the same surface (e.g., the processor 120 may be programmed to assume that SMI signals 116 indicating a surface within a threshold distance are generated by SMI sensors 104 facing the palm or other nearby body parts of the user, and then ignore those SMI signals 116). Alternatively, the user 110 of the system may position the device housing 102 such that its SMI sensors 104 emit beams of electromagnetic radiation 112 in useful directions.
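The signal-selection step described above can be sketched very simply: signals whose inferred absolute distance falls below a near-body threshold are assumed to come from sensors facing the palm or another nearby body part and are ignored. The threshold value and data representation are assumptions for illustration.

```python
# Hypothetical sketch of SMI-signal selection by distance thresholding.

PALM_THRESHOLD_M = 0.02  # assumed: anything closer than 2 cm is the user's own hand

def select_useful_signals(distances_m):
    """Return the indices of SMI signals whose measured absolute distance
    exceeds the near-body threshold (i.e., signals likely describing an
    external surface rather than the user's palm)."""
    return [i for i, d in enumerate(distances_m) if d >= PALM_THRESHOLD_M]

# Sensors 0 and 3 face the palm; sensors 1 and 2 see a distant surface.
useful = select_useful_signals([0.005, 0.12, 0.10, 0.008])  # -> [1, 2]
```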
In some embodiments, the processor 120 may be configured to transmit information indicative of the relationship between the device housing 102 and the surface 114 using the communication interface 108. The information may be transmitted to a remote device. In some cases, the transmitted information may include a sequence of time-dependent measurements, or a sequence of time-dependent positions, orientations, or movements. In other cases, processor 120 may be configured to recognize one or more gestures made by user 110 and transmit indications of the one or more gestures (these indications being a form of information indicating the relationship between device housing 102 and surface 114). Processor 120 may identify a gesture of user 110 by comparing a sequence of changes in one or more SMI signals 116 obtained from one or more SMI sensors 104 to one or more stored sequences that have been associated with one or more gestures. For example, the processor 120 may compare the sequence of changes of an SMI signal 116 to a stored sequence corresponding to a press or tap, and upon determining a match (or determining that the sequences are sufficiently similar to indicate a match), the processor 120 may indicate that the user 110 has made a press or tap gesture. Similarly, upon comparing the sequence of changes of a set of SMI signals 116 to a set of stored sequences corresponding to the user 110 writing the letter "A", or to a set of stored sequences corresponding to the user 110 making a circular motion, and determining a match to one of these gestures, the processor 120 may indicate that the user 110 has drawn the letter "A" or made a circular gesture.
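The comparison of an observed sequence of signal changes against stored sequences can be sketched as template matching. Normalized correlation is used here purely as a stand-in for whatever similarity measure the processor actually implements; the template values, threshold, and names are assumptions.

```python
# Hypothetical sketch of gesture recognition by comparing a sequence of
# SMI-signal changes against stored template sequences. Correlation is an
# illustrative similarity measure, not the method prescribed by the text.

import numpy as np

def match_gesture(observed, templates, threshold=0.9):
    """Return the name of the best-matching stored sequence, or None.

    observed  -- 1D array of signal changes (same length as templates)
    templates -- dict mapping gesture name -> stored 1D sequence
    """
    best_name, best_score = None, threshold
    x = (observed - np.mean(observed)) / (np.std(observed) + 1e-12)
    for name, t in templates.items():
        y = (t - np.mean(t)) / (np.std(t) + 1e-12)
        score = float(np.dot(x, y)) / len(x)  # normalized correlation in [-1, 1]
        if score > best_score:
            best_name, best_score = name, score
    return best_name

templates = {"tap": np.array([0, 1, 4, 1, 0], float),
             "circle": np.array([0, 2, 0, -2, 0], float)}
g = match_gesture(np.array([0, 1.1, 3.9, 1.0, 0]), templates)  # -> "tap"
```

A practical implementation would likely need time alignment (e.g., dynamic time warping) and multi-channel templates, but the compare-against-stored-sequences structure is the same.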
In addition to or instead of comparing the sequence of changes of the one or more SMI signals 116 to stored sequences of changes, the processor 120 may determine a set of time dependent locations, orientations, motion vectors, or other pieces of information in one, two, or three dimensions from the sequence of changes of the one or more SMI signals 116 and may compare the alternative information to stored information that has been associated with one or more predetermined gestures.
When determining the movement of the device housing 102 relative to the surface 114, there is ambiguity between displacement and rotation when using only a sequence of three time-dependent measurements. This is because the characterization of motion in a Cartesian coordinate system requires the characterization of six degrees of freedom (6DoF). A 6DoF characterization requires characterizations of six unknown quantities, and therefore requires a sequence of six time-dependent measurements: for example, not only measurements of displacement along three axes (the x-axis, y-axis, and z-axis), but also measurements of rotation (e.g., yaw, pitch, and roll) about each of the three axes. In other words, the processor 120 cannot solve for six unknowns using a sequence of only three time-dependent measurements. To provide a sequence of three additional time-dependent measurements, processor 120 may use SMI signals 116 obtained from six different SMI sensors 104 that emit beams 112 directed in six different directions toward surface 114. Alternatively, processor 120 may obtain a sequence of two or more time-dependent measurements from each of a fewer number of SMI sensors 104. For example, processor 120 may alternately modulate the input of each of a set of three SMI sensors 104 using a sinusoidal waveform and a triangular waveform, and obtain a sequence of time-dependent measurements for each modulation type from each of the three SMI sensors 104 (e.g., processor 120 may modulate the input of each SMI sensor 104 using a sinusoidal waveform during a first set of time periods, and modulate the input of each SMI sensor 104 using a triangular waveform during a second set of time periods). Modulating the input with a triangular waveform may provide absolute distance measurements, which may not be obtainable using sinusoidal waveform modulation.
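The six-measurement solve can be sketched as a linear system. Under a small-motion linearization (an assumption made here for illustration), the along-beam measurement of a sensor at position p_i with unit beam direction u_i is approximately u_i . (t + r x p_i), which is linear in the translation t and small-angle rotation r. The geometry and values below are illustrative.

```python
# Hypothetical sketch of recovering 6DoF motion (translation + small
# rotation) from six along-beam measurements. Uses the identity
# u . (r x p) = (p x u) . r to build one linear row per sensor.

import numpy as np

def solve_6dof(positions, directions, measurements):
    """Least-squares solve for (translation, small-angle rotation)."""
    rows = []
    for p, u in zip(positions, directions):
        u = np.asarray(u, float)
        u /= np.linalg.norm(u)
        rows.append(np.concatenate([u, np.cross(p, u)]))
    sol, *_ = np.linalg.lstsq(np.asarray(rows),
                              np.asarray(measurements, float), rcond=None)
    return sol[:3], sol[3:]

# Synthesize measurements from a known motion, then recover it.
t_true = np.array([1e-3, -2e-3, 0.5e-3])   # meters
r_true = np.array([1e-3, 2e-3, -1e-3])     # radians
P = [(1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 0), (0, 1, 1), (1, 0, 1)]
U = [(1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1), (1, -1, 0), (0, 1, -1)]
m = [np.dot(np.asarray(u, float) / np.linalg.norm(u),
            t_true + np.cross(r_true, p)) for p, u in zip(P, U)]
t_est, r_est = solve_6dof(P, U, m)  # recovers t_true and r_true
```

With only three measurements the same matrix has six columns but three rows, so translation and rotation cannot be separated, which is the ambiguity the paragraph describes.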
Communication interface 108 may include a wired and/or wireless communication interface (e.g., a Bluetooth Low Energy (BLE), Wi-Fi, or Universal Serial Bus (USB) interface) operable to communicate with a remote device (e.g., a mobile phone, electronic watch, tablet, or laptop computer).
FIGS. 2 and 3 illustrate examples of SMI-based gesture input systems, which may be embodiments of the system described with reference to FIG. 1. FIG. 2 illustrates an exemplary SMI-based gesture input system in the form of a closed loop 200. The closed loop 200 may be configured to receive a finger 202 of a user (i.e., the closed loop 200 may be a finger ring). A set of SMI sensors 204 housed within the closed loop 200 may emit beams of electromagnetic radiation 206 through apertures and/or window elements transparent to the wavelength of the emitted beams 206. By way of example, the closed loop 200 includes three SMI sensors 204 that emit orthogonal beams of electromagnetic radiation 206. In alternative embodiments, the closed loop 200 may include more or fewer SMI sensors 204 that emit orthogonal or non-orthogonal beams of electromagnetic radiation 206.
FIG. 3 illustrates an exemplary SMI-based gesture input system in the form of an open loop 300. The open loop 300 may be configured to receive a finger 302 of a user (e.g., the open loop 300 may be a ring). The open loop 300 may include SMI sensors 304 disposed to emit beams of electromagnetic radiation 306 along the open loop body 308 and/or from one or both ends 310, 312 of the open loop body 308 (e.g., from covers at the ends 310, 312 of the open loop body 308). By way of example, the open loop 300 includes three SMI sensors 304 that emit orthogonal beams of electromagnetic radiation 306. In alternative embodiments, open loop 300 may include more or fewer SMI sensors 304 that emit orthogonal or non-orthogonal beams of electromagnetic radiation 306. While the SMI sensors 304 are shown in FIG. 3 as being proximate to both ends 310, 312 of the open loop 300, alternatively, all of the SMI sensors 304 (or more or fewer SMI sensors 304) may be disposed proximate to one end of the open loop 300.
As shown in FIG. 3, an open loop may be useful because it may not block the inner surface of the user's hand, which in some cases may improve the user's ability to grasp an object, feel texture on a surface, or receive tactile output provided via a surface.
In some embodiments, the wearable device described with reference to any of FIGS. 1-3 may determine the absolute distance, direction, and speed of a surface relative to an SMI sensor by applying triangular modulation to the input of the SMI sensor, as described with reference to FIGS. 10 and 11. The displacement of the surface can then be obtained by integrating the velocity. In some embodiments, the wearable device may use I/Q demodulation to determine the displacement and direction of the surface relative to the SMI sensor (in the time domain), as described with reference to FIG. 12. The absolute distance can then be obtained using triangular modulation.
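The I/Q approach mentioned above can be sketched as follows: given in-phase (I) and quadrature (Q) components of the SMI signal, the unwrapped phase of the I/Q pair tracks displacement, with 2*pi of phase corresponding to lambda/2 of motion. This is a toy phase-unwrapping step only; the demodulation front end that produces I and Q, and the wavelength used, are assumptions.

```python
# Hypothetical sketch of the time-domain displacement readout after I/Q
# demodulation: unwrap the phase angle of (I, Q) samples and scale it
# to meters (2*pi of phase = lambda/2 of along-beam motion).

import math

WAVELENGTH = 940e-9  # assumed VCSEL wavelength

def displacement_from_iq(i_samples, q_samples):
    """Unwrap the I/Q phase and convert it to displacement in meters."""
    phases = [math.atan2(q, i) for i, q in zip(i_samples, q_samples)]
    total = 0.0
    for prev, cur in zip(phases, phases[1:]):
        delta = cur - prev
        # unwrap: keep each step within (-pi, pi]
        if delta > math.pi:
            delta -= 2 * math.pi
        elif delta <= -math.pi:
            delta += 2 * math.pi
        total += delta
    return total * WAVELENGTH / (4 * math.pi)  # 2*pi of phase = lambda/2
```

The sign of the result gives the direction of motion along the beam, which is why I/Q demodulation yields both displacement and direction as the text states.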
In some cases, a wearable device such as a finger ring may include a deformable or compressible insert that enables the finger ring to be worn farther from or closer to the user's fingertip.
In some cases, the ring may be rotated by the user such that the ring may alternately sense a surface under the user's hand, a surface of an object held by the user, an adjacent finger, and the like.
In some cases, the wearable device may include sensors other than SMI sensors, such as an Inertial Measurement Unit (IMU). In some cases, the additional sensors may also be used to characterize motion. The wearable device may also include a haptic engine for providing haptic feedback to a user, a battery, or other components.
Fig. 4 illustrates a wearable device 400 having a set of SMI sensors 402 from which a processor of device 400 may select subset 404 to determine a relationship between wearable device 400 and surface 406. Alternatively, the processor of device 400 may use the SMI signals generated by different subsets 404, 408 of SMI sensors 402 to determine the relationship between wearable device 400 and different surfaces 406, 410 (e.g., desktop 406 and a user's finger 410 adjacent to the finger on which device 400 is worn). By way of example, the wearable device 400 is shown as a closed finger ring (e.g., a wearable device having a form factor similar to that of the closed ring described with reference to fig. 2). In alternative embodiments, the apparatus 400 may take other forms.
In fig. 4, SMI sensors 402 are grouped into subsets of three SMI sensors 402, and the subsets are located at different locations around the circumference of device 400. In other embodiments, a subset of SMI sensors 402 may have a different number of SMI sensors 402 (in some cases, only one SMI sensor 402 is included). In some embodiments, SMI sensors 402 may not be arranged in discrete subsets, and the processor of apparatus 400 may analyze the SMI signals received from SMI sensors 402 and dynamically identify a subset of one or more of SMI sensors 402 in response to analyzing the SMI signals. The processor may also determine that one or more of the SMI sensors did not generate a useful SMI signal and exclude those SMI sensors from inclusion in any subset (and in some cases, those SMI sensors may not be used until a change in the SMI signal of those SMI sensors is identified).
In some embodiments of the device 400 (or in embodiments of other devices described herein), the device 400 may include one or more sensors for determining an orientation of the device 400 relative to its user (e.g., relative to a finger on which the device 400 is worn, one or more adjacent fingers, a palm of the user, etc.) or a surface (e.g., a desktop, paper, wall, surface of the user's body, etc.). The sensors may include, for example, one or more of a proximity sensor, a contact sensor, a pressure sensor, an accelerometer, an IMU, and the like.
FIG. 5 illustrates another exemplary SMI-based gesture input system 500. In contrast to the system described with reference to fig. 1, system 500 may include more than one device. For example, system 500 may include a wearable device 502 configured to be worn by a user, and an object 504 configured to be held by the user.
In some embodiments, wearable device 502 may be configured similarly to the wearable device described with reference to FIG. 1, and may include a device housing 506, a set of one or more SMI sensors 508 mounted within device housing 506, a processing system 510 mounted within device housing 506, and/or a communication interface 512 mounted within device housing 506. The device housing 506, SMI sensors 508, processing system 510, and/or communication interface 512 may be configured similarly to the same components described with reference to FIG. 1. In some embodiments, wearable device 502 may be a ring, such as described with reference to FIG. 2 or 3.
In some implementations, the object 504 may be shaped as one or more of a stylus, pen, pencil, marker, or brush. The object 504 may also take other forms.
In some cases, one or more of SMI sensors 508 in wearable device 502 may emit a beam of electromagnetic radiation 514 that impinges on object 504. As object 504 is moved by a user, such as for writing or drawing, the relationship between wearable device 502 and object 504 may change. Processing system 510 may extract information regarding the time-varying relationship between wearable device 502 and object 504 (and/or information regarding the time-varying relationship between wearable device 502 and a surface other than the surface of object 504) from the SMI signals of SMI sensors 508, and in some cases may identify one or more gestures made by the user. In some cases, the gesture may include an alphanumeric string (one or more characters) written by the user. In these cases, processing system 510 may be configured to identify an alphanumeric string from information regarding the time-varying relationship between wearable device 502 and object 504. SMI sensors 508 may also or alternatively be used to determine whether a user is holding object 504 and to track or predict movement of object 504. For example, if object 504 is a writing instrument (e.g., a pen), the SMI signals generated by SMI sensors 508 may be analyzed to determine whether the user is holding object 504, and in some cases, whether the user is loosely or tightly holding object 504. The processing system 510 may determine whether the user is about to write, make a gesture, etc., based on the presence of the object 504 and/or the user's grip and/or movement of the object 504. The processing system 510 may then fully wake the wearable device 502 in response to the presence, grip, and/or movement of the object 504, or begin recording the movement of the object 504 and/or recognizing letters written by the user with the object 504, gestures made, and the like.
In some implementations, processing system 510 may switch wearable device 502 to a first mode, in which SMI sensors 508 are used to track movement relative to a desktop or the user when object 504 is not detected, and switch wearable device 502 to a second mode, in which SMI sensors 508 are used to track movement of object 504 when object 504 is detected. In some embodiments, SMI sensors 508 may track the movement of object 504 by tracking the movement of wearable device 502 relative to a desktop or other surface (i.e., a surface other than the surface of object 504). This is possible because the user's grip on the object 504 may affect the manner in which the user holds their hand or moves their fingers, which may indicate the manner in which the user moves the object 504 (e.g., indicate the letter or gesture the user is writing with the object 504). In some cases, wearable device 502 may effectively transform any object (including dumb objects or non-electronic objects) into a smart pen or the like.
In some cases, wearable device 502 may have relatively more SMI sensors 508, such as described with reference to FIG. 4. In some cases, object 504 may have one or more SMI sensors 516 therein in addition to wearable device 502 having one or more SMI sensors 508 therein. When provided, SMI sensors 516 may be used similarly to SMI sensors 508 included in wearable device 502, and may determine the relationship of object 504 to the wearable device, the user's skin (i.e., a surface of the user), or a remote surface (e.g., surface 518). SMI sensors 516 may be positioned along the body of object 504 (e.g., proximate to where the user may hold object 504) or near the tip of object 504 (e.g., proximate to the pointing, writing, or painting tip of object 504). In some embodiments, object 504 may include a processing system and/or a communication interface for communicating the SMI signals generated by SMI sensors 516, or information related thereto or derived therefrom, to wearable device 502. Alternatively or additionally, the processing system and/or communication interface may receive SMI signals, or information related thereto or derived therefrom, from wearable device 502. The wearable device 502 and the object 504 may communicate wirelessly or may be connected by wires, cables, and/or leads. In some implementations, the processing system 510 of the wearable device 502 can bear a majority of the processing burden (e.g., recognizing gestures). In other embodiments, the processing system of object 504 may bear a majority of the processing burden, or the processing burden may be shared. In still other embodiments, object 504 may include all of the system's SMI sensors and the processing system.
Fig. 6 shows an example of the system described with reference to fig. 5, where wearable device 502 is a ring and object 504 is shaped as one or more of a stylus, pen, pencil, marker, or brush.
In some cases, the SMI-based gesture input system may include more than one wearable device and/or more than one handheld device. For example, fig. 7 shows an alternative embodiment of the system described with reference to fig. 5, wherein the object 504 is also a wearable device. By way of example, both wearable device 502 and object 504 are shown as rings. For example, rings worn on the user's thumb and forefinger may be used to recognize gestures, such as pinching, zooming, rotating, etc.
An SMI-based gesture input system, such as one of the systems described with reference to fig. 1-7, may be used in some cases to provide input to AR, VR, or MR applications. The SMI-based gesture input system may also be used as an anchor for another system. For example, in a camera-based gesture input system, it is difficult to determine whether a camera or a user's hand (or finger) is moving. The SMI-based gesture input system may replace the camera-based gesture input system or may provide anchoring information to the camera-based gesture input system.
FIG. 8A illustrates a first exemplary SMI sensor 800 that may be used in one or more of the SMI-based gesture input systems described with reference to FIGS. 1-7. In this example, the SMI sensor 800 may include a VCSEL 802 with an integrated resonant cavity (or intra-cavity) photodetector (RCPD) 804.
FIG. 8B illustrates a second exemplary SMI sensor 810 that may be used in one or more of the SMI-based gesture input systems described with reference to FIGS. 1-7. In this example, the SMI sensor 810 may include a VCSEL 812 with an external on-chip RCPD 814. For example, the RCPD 814 may form a disk surrounding the VCSEL 812.
FIG. 8C illustrates a third exemplary SMI sensor 820 that may be used in one or more of the SMI-based gesture input systems described with reference to FIGS. 1-7. In this example, the SMI sensor 820 may include a VCSEL 822 with an external off-chip photodetector 824.
FIG. 8D illustrates a fourth exemplary SMI sensor 830 that may be used in one or more of the SMI-based gesture input systems described with reference to FIGS. 1-7. In this example, the SMI sensor 830 may comprise a dual emission VCSEL 832 with an external off-chip photodetector 834. For example, top emission may be emitted toward the optics and/or another target, and bottom emission may be provided to an external off-chip photodetector 834.
FIGS. 9A-9D illustrate different beam shaping or beam steering optics that may be used with any of the SMI sensors described with reference to FIGS. 1-8D. FIG. 9A illustrates beam shaping optics 900 (e.g., a lens or collimator) that collimate an electromagnetic radiation beam 902 emitted by an SMI sensor 904. A collimated beam may be useful when the range supported by the device is relatively large (e.g., when the device has a range of about ten centimeters). FIG. 9B illustrates beam shaping optics 910 (e.g., a lens) that focus an electromagnetic radiation beam 912 emitted by an SMI sensor 914. Focusing the beam of electromagnetic radiation may be useful when the range supported by the device is limited (e.g., limited to a few centimeters). FIG. 9C shows beams of electromagnetic radiation 922 emitted by a set of SMI sensors 924 and steered by beam steering optics 920 (e.g., a lens or a set of lenses) such that the beams 922 converge. Alternatively, the SMI sensors 924 may be configured or oriented such that their beams converge without the optics 920. In some embodiments, the beam steering optics 920 may include or be associated with beam shaping optics (such as the beam shaping optics described with reference to FIG. 9A or 9B). FIG. 9D shows beams of electromagnetic radiation 932 emitted by a set of SMI sensors 934 and steered by beam steering optics 930 (e.g., a lens or a set of lenses) such that the beams 932 diverge. Alternatively, the SMI sensors 934 may be configured or oriented such that their beams diverge without the optics 930. In some embodiments, the beam steering optics 930 may include or be associated with beam shaping optics (such as the beam shaping optics described with reference to FIG. 9A or 9B).
FIG. 10 illustrates a triangular modulation process 1000 for determining the velocity and absolute distance of a surface (or object) using self-mixing interferometry. Process 1000 may be used by one or more of the systems or devices described with reference to FIGS. 1-7 to modulate an SMI sensor using a triangular waveform, for example, as described with reference to FIG. 1.
At an initial stage 1002, an initial signal is generated, such as by a digital or analog signal generator. At stage 1006-1, the generated initial signal is processed as needed to generate a triangle waveform modulation current 1102 (see FIG. 11) that is applied to the VCSEL. Stage 1006-1 may include, as needed, digital-to-analog conversion (e.g., when the initial signal is the output of a digital step generator), low pass filtering (e.g., to remove quantization noise introduced by the DAC), and voltage-to-current conversion.
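The digital side of stage 1006-1 can be sketched as a stepped approximation of one triangle period, such as a digital step generator might feed to a DAC. The sample count and amplitude are assumptions for illustration.

```python
# Hypothetical sketch of a stepped digital triangle waveform (one period):
# ramp up over the first half of the samples, then ramp back down.

def triangle_steps(n_steps: int, amplitude: float = 1.0):
    """One period of a stepped triangle wave as a list of n_steps values."""
    half = n_steps // 2
    up = [amplitude * i / half for i in range(half + 1)]
    down = [amplitude * (half - i) / half for i in range(1, half)]
    return up + down

w = triangle_steps(8)  # [0.0, 0.25, 0.5, 0.75, 1.0, 0.75, 0.5, 0.25]
```

In the signal chain of FIG. 11, such values would be handed to a DAC, low pass filtered to remove quantization steps, and converted from voltage to current before driving the VCSEL.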
Application of the modulation current 1102 to the VCSEL induces an SMI output 1118 (i.e., a change in the interference characteristics of the VCSEL). For simplicity of discussion, it will be assumed that the SMI output 1118 is from a photodetector, but in other embodiments the SMI output may be from another component.
At initial stage 1004 of FIG. 10, SMI output 1118 is received. At stage 1006-2, initial processing of SMI output 1118 is performed as needed. Stage 1006-2 may include high pass filtering or digital subtraction.
At stage 1008, the processor may equalize the received signals, if desired, to match peak-to-peak, average, root mean square, or any other characteristic value of the signals. For example, the SMI output 1118 may include a dominant triangle waveform component matching the modulation current 1102, with smaller, higher-frequency components superimposed on it due to changes in the interferometric characteristics. High pass filtering may be applied to the SMI output 1118 to obtain the component signal related to the interferometric characteristics. This stage may also include separating and/or subtracting the portions of the SMI output 1118 and the modulation current 1102 corresponding to the rise and fall time intervals of the modulation current 1102. This stage may include sampling the separated information.
At stages 1010 and 1012, separate Fast Fourier Transforms (FFTs) are performed on the portions of the processed SMI output 1118 corresponding to the rise and fall time intervals, respectively. The two FFT spectra may then be analyzed at stage 1014.
At stage 1016, the FFT spectra may be further processed, for example, to remove artifacts and reduce noise. Such further processing may include peak detection and Gaussian fitting around the detected peaks to improve frequency accuracy. From the processed FFT spectral data, information about absolute distance may be obtained at stage 1018.
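A minimal sketch of stages 1010-1018 — FFT the rising and falling halves of a processed SMI output, detect the peak beat frequency in each half, and combine the two peaks — might look like the following. The windowing choice, the synthetic beat frequencies, and the omitted physical scale factors are illustrative assumptions:

```python
import numpy as np

def rise_fall_peak_freqs(smi_samples, fs_hz):
    """Split one triangle period of SMI samples into rising and falling halves,
    FFT each half, and return the dominant beat frequency of each."""
    half = len(smi_samples) // 2
    freqs = []
    for seg in (smi_samples[:half], smi_samples[half:]):
        seg = seg - seg.mean()                              # crude DC / ramp removal
        spec = np.abs(np.fft.rfft(seg * np.hanning(len(seg))))
        spec[0] = 0.0                                       # ignore any DC residue
        freqs.append(np.fft.rfftfreq(len(seg), 1.0 / fs_hz)[np.argmax(spec)])
    return freqs  # [f_rise, f_fall]

# Synthetic beat tones: 30 kHz on the rising ramp, 50 kHz on the falling ramp
# (a moving target makes the two beat frequencies differ).
fs = 1e6
t = np.arange(500) / fs
sig = np.concatenate([np.sin(2 * np.pi * 30e3 * t), np.sin(2 * np.pi * 50e3 * t)])
f_rise, f_fall = rise_fall_peak_freqs(sig, fs)
# The mean of the two peaks tracks absolute distance; half their difference
# tracks directional velocity (scale factors omitted here).
f_distance = 0.5 * (f_rise + f_fall)
f_velocity = 0.5 * (f_fall - f_rise)
```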
Fig. 11 illustrates a block diagram of a system (e.g., a portion or all of the processing system described with reference to fig. 1-7) that may implement the spectral analysis described with respect to fig. 10. In the exemplary system shown, an initial digital signal is generated and processed as needed to produce the modulation current 1102 as an input to the VCSEL 1110. In an illustrative example, an initial step signal may be generated by a digital step generator to approximate a triangular waveform. The digital output values of the generator are converted to an analog voltage signal by a digital-to-analog converter (DAC) 1104. The resulting voltage signal may then be filtered by a low pass filter 1106 to remove quantization noise. Alternatively, an integrator-based analog signal generator may be used to generate the equivalent voltage signal directly. The filtered voltage signal is then input to a voltage-to-current converter 1108 to produce the desired modulation current 1102 for input to the VCSEL 1110.
As described above, movement of the target may cause changes in interferometric parameters, such as parameters of the VCSEL 1110 or of a photodetector operating in the system. These changes may be measured to generate the SMI output 1118. In the illustrated embodiment, it will be assumed that the SMI output 1118 is measured by a photodetector. For a modulation current 1102 having a triangular waveform, the SMI output 1118 may be a triangular wave of similar period, combined with smaller, higher-frequency signals associated with the interferometric properties. In some cases, even though the ramps of the modulation current 1102 are linear, the SMI output 1118 may not be perfectly linear. This may be due to nonlinearity in the bias-current-versus-optical-output curve of the VCSEL 1110 (e.g., due to non-idealities such as self-heating effects).
The SMI output 1118 is first passed to a high pass filter 1120 that effectively converts the dominant rising and falling ramp components of the SMI output 1118 to DC offsets. Because the SMI output 1118 may typically be a current, the transimpedance amplifier 1122 may generate a corresponding voltage output (amplified or not) for further processing.
The voltage output may then be sampled and quantized by ADC block 1124. It may be helpful to apply equalization before applying the digital FFT to the output of ADC block 1124. The initial digital signal values from the digital generator used to generate the modulation current 1102 are used as an input to a digital high pass filter 1112 to generate a digital reference signal to be compared with the output of the ADC block 1124. The digital variable gain block 1114 may apply an adjustable gain to the output of the digital high pass filter 1112.
The output of the digital variable gain block 1114 serves as one input to a digital equalizer and subtractor block 1116. The other input to the digital equalizer and subtractor block 1116 is the output of the ADC block 1124. The two signals are differenced, and the difference is used as part of a feedback loop that adjusts the gain provided by the digital variable gain block 1114.
Equalization and subtraction may be used to remove any remaining triangle-waveform artifacts from the SMI output 1118. For example, if there is a slope error or nonlinearity in the SMI output 1118, the digital high pass filter 1112 may not completely eliminate the triangle component, and artifacts may remain. These artifacts may appear as low frequency components after the FFT, making peak detection difficult for nearby objects. They may be partially or completely removed by applying equalization and subtraction.
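A minimal sketch of the equalize-and-subtract idea follows, using a one-shot least-squares gain in place of the feedback loop of blocks 1114 and 1116; the signal values are synthetic and the function name is an illustrative assumption:

```python
import numpy as np

def equalize_and_subtract(adc_out, ref_triangle):
    """Find the least-squares gain that best matches the reference triangle to
    the residual artifact in the ADC output, then subtract the scaled
    reference (cf. blocks 1114 and 1116)."""
    gain = np.dot(adc_out, ref_triangle) / np.dot(ref_triangle, ref_triangle)
    return adc_out - gain * ref_triangle, gain

# Synthetic example: a 3 kHz SMI tone plus a residual triangle artifact
# whose true amplitude (0.7) the equalizer must discover.
fs = 1e6
t = np.arange(1000) / fs
ref = np.where(t < 5e-4, t, 1e-3 - t)        # residual triangle shape
adc = np.sin(2 * np.pi * 3e3 * t) + 0.7 * ref
cleaned, gain = equalize_and_subtract(adc, ref)
```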
Once the best correlation is obtained by feedback, the FFT indicated by block 1128 may be applied to the components of the output of ADC block 1124 corresponding to the rising and falling sides of the triangular wave. From the obtained FFT spectrum, the peak frequencies detected on the rising and falling sides may be used to infer absolute distance and/or directional velocity, as described above and indicated by block 1126.
The method just described, and its variants, involve applying spectral analysis to the SMI output. However, it should be understood that this is merely an example. In other implementations, the absolute distance may be determined directly from the time-domain SMI output, without applying spectral analysis. Various configurations are possible and contemplated without departing from the scope of this disclosure.
Fig. 12 illustrates a sinusoidal bias process 1200 for determining displacement of a surface (or object) using quadrature demodulation with self-mixing interferometry. Process 1200 may be used by one or more of the systems or devices described with reference to fig. 1-7 to modulate an SMI sensor using a sinusoidal waveform, for example, as described with reference to fig. 1.
As explained in more detail below, fig. 12 shows the components that generate and apply a sinusoidally modulated bias current to the VCSEL. The sinusoidal bias current may generate an output current in the photodetector 1216 that depends on the frequency of the sinusoidal bias and on the displacement of the target relative to the device. In the circuit of fig. 12, the output current of the photodetector 1216 is digitally sampled and then multiplied by a first sine wave at the original sinusoidal modulation frequency of the bias current and by a second sine wave at twice that frequency. The two multiplied outputs are then each low pass filtered, and the phase of the interferometric signal may be calculated. Thereafter, at least the phase is used to determine the displacement.
A DC voltage generator 1202 is used to generate a constant bias voltage. A sine wave generator 1204 may generate a sine wave at a single frequency, to be combined with the constant voltage. As shown in fig. 12, the sine wave generator 1204 is a digital generator, but in other implementations it can generate an analog sine wave. The low pass filter 1206-1 filters the output of the DC voltage generator 1202 to reduce unwanted variations in the constant bias voltage. The band pass filter 1206-2 may be used to reduce noise, quantization error, other distortion, or frequency components far from the intended modulation frequency ωm in the output of the sine wave generator 1204.
The circuit summer 1208 combines the low-pass filtered constant bias voltage with the band-pass filtered sine wave to produce a combined voltage signal on the link 1209, which in the embodiment of fig. 12 has the form V0 + Vm sin(ωm t). The voltage signal is used as an input to a voltage-to-current converter 1210 to generate a current to drive the lasing of the VCSEL 1214. The current on line 1213 from voltage-to-current converter 1210 may have the form I0 + Im sin(ωm t).
The VCSEL 1214 is thus driven to emit laser light modulated as described above. Reflections of the modulated laser light may be received back into the laser cavity of the VCSEL 1214 and cause self-mixing interference. The resulting emitted optical power of the VCSEL 1214 may be modified by the self-mixing interference, and the modification may be detected by the photodetector 1216. As described above, in this case, the photocurrent output of the photodetector 1216 on link 1215 can have approximately the form: iPD(t) = I0 + Im sin(ωm t) + γ cos(θ0 + b sin(ωm t)), where θ0 is the interferometric phase (which varies with the distance to the target) and b is the effective phase modulation depth.
Since the I/Q components to be used in subsequent stages are based on the third term only, the first two terms can be removed or reduced by the differential transimpedance amplifier and anti-aliasing (DTIA/AA) filter 1218. To do this, a scaled replica of the first two terms is generated by the voltage divider 1212. The voltage divider 1212 may use as its input the combined voltage signal generated by the circuit summer 1208 on the link 1209. The output of the voltage divider 1212 on link 1211 may then have the form α(V0 + Vm sin(ωm t)). The photodetector current and the output of the voltage divider 1212 may be the inputs to the DTIA/AA filter 1218. The output of the DTIA/AA filter 1218 may then be at least largely proportional to the third term of the photodetector current.
The output of the DTIA/AA filter 1218 may then be quantized for subsequent computation by ADC block 1220. Further, the output of the ADC block 1220 may have a residual signal component proportional to the sine wave initially generated by the sine wave generator 1204. To filter out this residual component, the initially generated sine wave may be scaled at multiplier block 1224-3 (such as by a factor β) and then subtracted from the output of ADC block 1220 at subtraction block 1222. Based on the Fourier expansion of the third term of the photodetector current, the filtered output on link 1221 may have the form A + B sin(ωm t) + C cos(2ωm t) + D sin(3ωm t) + .... The filtered output may then be used to extract the I/Q components by mixing.
Multiplier block 1224-1 mixes (multiplies) the digital sine wave initially generated by sine wave generator 1204 on link 1207 with the filtered output on link 1221. The product is then low pass filtered at block 1228-1 to obtain the Q component described above, possibly after scaling by a number related to the amount of frequency modulation of the laser and the distance to the target.
In addition, the initially generated digital sine wave is used as an input into the squaring/filtering block 1226 to generate a digital cosine wave having a frequency twice that of the initially generated digital sine wave. The digital cosine wave is then mixed (multiplied) with the filtered output of ADC block 1220 on link 1221 at multiplier block 1224-2. The product is then low pass filtered at block 1228-2 to obtain the I component described above, possibly after scaling by a number related to the amount of frequency modulation of the laser and the distance to the target.
The phase calculation section 1230 then uses the Q component and the I component to obtain a phase from which the displacement of the target can be calculated, as described above.
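Under the assumed photocurrent form whose Fourier expansion is described above, the quadrature demodulation of fig. 12 can be sketched as follows. Averaging over whole modulation periods stands in for the low pass filters, the J1/J2 amplitude scaling is omitted, and the sign convention is an assumption:

```python
import numpy as np

def iq_displacement_phase(pd_samples, fs_hz, f_mod_hz):
    """Mix the filtered photodetector samples with sin(wm*t) and cos(2*wm*t),
    low-pass by averaging over whole modulation periods, and recover the
    interferometric phase from the Q and I components."""
    t = np.arange(len(pd_samples)) / fs_hz
    q = np.mean(pd_samples * np.sin(2 * np.pi * f_mod_hz * t))      # ~ -J1(b)*sin(theta0)
    i = np.mean(pd_samples * np.cos(2 * np.pi * 2 * f_mod_hz * t))  # ~  J2(b)*cos(theta0)
    return np.arctan2(-q, i)   # exact only when J1(b) == J2(b)

# Synthetic SMI term cos(theta0 + b*sin(wm*t)), with b near the point where
# J1(b) = J2(b) so the Q and I components have (almost) equal amplitude.
fs, f_mod, b, theta0 = 1e6, 1e3, 2.63, 0.8
t = np.arange(10_000) / fs                 # exactly ten modulation periods
pd = np.cos(theta0 + b * np.sin(2 * np.pi * f_mod * t))
recovered = iq_displacement_phase(pd, fs, f_mod)   # close to theta0
```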
Those skilled in the art will appreciate that while the embodiment shown in fig. 12 utilizes a digital version of the initially generated sine wave generated by sine wave generator 1204 on link 1207, in other embodiments the initially generated sine wave may be an analog signal and mixed with the analog output of DTIA/AA filter 1218. In other embodiments, the voltage divider 1212 may be a variable voltage divider. In other embodiments, the voltage divider 1212 may be omitted and DTIA/AA filter 1218 may be a single-ended DTIA/AA filter. In such implementations, the subtraction may be done digitally only at subtraction block 1222. In other implementations, the subtraction block 1222 may be omitted, and subtraction of the modulation current may not be performed.
The circuit of FIG. 12 may be adapted to implement the modified I/Q method described above, which uses Q′ ∝ Lowpass{IPD × sin(3ωm t)}. Some such circuit adaptations may include directly generating the mixing signals sin(2ωm t) and sin(3ωm t), multiplying each mixing signal with the output of the ADC block 1220, and then applying corresponding low pass filtering, such as by blocks 1228-1, 1228-2. The DTIA/AA filter 1218 may then be replaced by a filter that removes or substantially reduces the entire component of IPD at the initial modulation frequency ωm. Those skilled in the art will recognize other circuit adaptations for implementing the modified I/Q method. For example, the signal sin(3ωm t) may be generated by multiplying the signal on link 1207 with the output of the squaring/filtering block 1226, and then applying band-pass filtering to suppress frequency components other than sin(3ωm t).
In additional and/or alternative embodiments, the I/Q time-domain method just described may be used together with the spectrum-based method of the first series of embodiments. The spectral method may be used at a particular time to determine the absolute distance to the target, providing the value L0. Thereafter, during subsequent time intervals, the displacement ΔL may be determined using any of the I/Q methods just described.
In additional and/or alternative embodiments, a spectral method based on triangular wave modulation of the bias current of the VCSEL can be used as a guide for the I/Q time-domain method. The I/Q method operates optimally when J1(b) = J2(b), so that the I and Q components have the same amplitude. However, b depends on the distance L. One embodiment may first apply triangular wave modulation to the bias current of the VCSEL to determine the distance to the point of interest. This distance is then used to find the optimal peak-to-peak sinusoidal modulation of the bias current for use in the I/Q method. Such a dual method may improve the signal-to-noise ratio and the displacement accuracy obtained by the I/Q method.
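The operating point at which J1(b) = J2(b) can be located numerically. The sketch below uses the integral representation of the Bessel functions and simple bisection; it is a stand-in for however an actual implementation would tabulate the optimal modulation depth:

```python
import math

def bessel_j(n, x, steps=2000):
    """J_n(x) via the integral form (1/pi) * int_0^pi cos(n*tau - x*sin(tau)) dtau,
    evaluated with a midpoint rule (adequate for small n and x)."""
    h = math.pi / steps
    return sum(
        math.cos(n * (k + 0.5) * h - x * math.sin((k + 0.5) * h))
        for k in range(steps)
    ) * h / math.pi

def optimal_modulation_depth(lo=1.0, hi=3.0, iters=50):
    """Bisect for the depth b where J1(b) = J2(b), the point at which the
    I and Q components have equal amplitude."""
    f = lambda b: bessel_j(1, b) - bessel_j(2, b)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

b_opt = optimal_modulation_depth()   # lands near b = 2.63
```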
FIG. 13 illustrates an exemplary method 1300 of identifying gesture types. The method 1300 may be performed, for example, by any of the processing systems or processors described herein.
At block 1302, method 1300 may include emitting a beam of electromagnetic radiation from each of a set of one or more SMI sensors disposed in a wearable device. Alternatively, a beam of electromagnetic radiation may be emitted from each SMI sensor in a set of one or more SMI sensors disposed in the handheld device.
At block 1304, method 1300 may include sampling an SMI signal generated by each SMI sensor to generate a time-varying sample stream for each SMI sensor.
At block 1306, method 1300 may include determining movement of the wearable device (or handheld device) relative to the surface using a processor of the wearable device and a time-varying sample stream of at least one SMI sensor of the set of one or more SMI sensors. The operations at block 1306 may also or alternatively include determining a position and/or orientation of the wearable device (or handheld device) relative to the surface.
At block 1308, method 1300 may include transmitting information from the wearable device (or handheld device) to a remote device indicating movement of the wearable device (or handheld device).
In some embodiments, method 1300 may include modulating the input to the SMI sensors (or to each SMI sensor) using a triangular waveform or a sinusoidal waveform. In some embodiments, method 1300 may include 1) modulating an input to an SMI sensor (or to each SMI sensor) using a first type of modulation when generating a first subset of samples in the time-varying sample stream for the SMI sensor, and 2) modulating an input to the SMI sensor (or to each SMI sensor) using a second type of modulation when generating a second subset of samples in the time-varying sample stream for the SMI sensor, wherein the first type of modulation is different from the second type of modulation (e.g., triangle modulation versus sine modulation).
In some embodiments of method 1300, the at least one SMI sensor may comprise three SMI sensors, and determining movement of the wearable device (or handheld device) relative to the surface may comprise determining movement of the wearable device with six degrees of freedom (6-DoF).
In some embodiments of method 1300, the set of one or more SMI sensors includes a plurality of SMI sensors, and method 1300 may include analyzing a time-varying sample stream generated for the plurality of SMI sensors, and identifying the at least one SMI sensor for determining movement of the wearable device (or handheld device) relative to the surface based at least in part on the analysis.
In some embodiments of method 1300, the at least one SMI sensor may be a first subset of one or more SMI sensors and the surface may be a first surface. In these embodiments, method 1300 may include determining movement of the wearable device (or handheld device) relative to the second surface using a processor of the wearable device (or handheld device) and a time-varying sample stream of the second subset of one or more SMI sensors of the set of one or more SMI sensors.
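The overall flow of blocks 1304-1306 can be sketched as follows; the data structure, function names, and the toy displacement estimator are hypothetical illustrations, not part of method 1300:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class SmiChannel:
    """One SMI sensor's time-varying sample stream (hypothetical container)."""
    samples: List[float] = field(default_factory=list)

def track_movement(channels: List[SmiChannel],
                   displacement_of: Callable[[List[float]], float]) -> List[float]:
    """Reduce each sensor's sample stream to a displacement estimate
    (cf. blocks 1304-1306); with three or more sensors, the per-sensor
    estimates could be combined into x/y/z or 6-DoF motion."""
    return [displacement_of(ch.samples) for ch in channels]

# Toy usage with a stand-in estimator (mean sample value):
chans = [SmiChannel([0.1, 0.2, 0.3]), SmiChannel([0.0, 0.0, 0.0])]
motion = track_movement(chans, lambda s: sum(s) / len(s))
```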
Fig. 14 illustrates an exemplary electrical block diagram of an electronic device 1400 that may be implemented in some cases as any of the devices described with reference to fig. 1-7 and 13. Electronic device 1400 may include electronic display 1402 (e.g., a light emitting display), processor 1404, power source 1406, memory 1408 or storage device, sensor system 1410, or input/output (I/O) mechanism 1412 (e.g., an input/output device, input/output port, or tactile input/output interface). The processor 1404 may control some or all of the operations of the electronic device 1400. The processor 1404 may communicate directly or indirectly with some or all of the other components of the electronic device 1400. For example, a system bus or other communication mechanism 1414 may provide for communication between the electronic display 1402, the processor 1404, the power supply 1406, the memory 1408, the sensor system 1410, and the I/O mechanism 1412.
The processor 1404 may be implemented as any electronic device capable of processing, receiving, or transmitting data or instructions, whether in the form of software or firmware or otherwise encoded. For example, the processor 1404 may include a microprocessor, a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a controller, or a combination of such devices. As described herein, the term "processor" is intended to encompass a single processor or processing unit, multiple processors, multiple processing units, or one or more other suitably configured computing elements. In some cases, the processor 1404 may provide a portion or all of the processing systems or processors described with reference to fig. 1-7 and 10-13.
It should be noted that the components of the electronic device 1400 may be controlled by multiple processors. For example, select components of the electronic device 1400 (e.g., the sensor system 1410) may be controlled by a first processor and other components of the electronic device 1400 (e.g., the electronic display 1402) may be controlled by a second processor, wherein the first processor and the second processor may or may not be in communication with each other.
The power source 1406 may be implemented with any device capable of providing energy to the electronic device 1400. For example, power source 1406 may include one or more batteries or rechargeable batteries. Additionally or alternatively, the power source 1406 may include a power connector or cord that connects the electronic device 1400 to another power source, such as a wall outlet.
Memory 1408 may store electronic data that may be used by electronic device 1400. For example, the memory 1408 may store electronic data or content such as, for example, audio and video files, documents and applications, device settings and user preferences, timing signals, control signals, and data structures or databases. Memory 1408 may include any type of memory. By way of example only, the memory 1408 may include random access memory, read-only memory, flash memory, removable memory, other types of storage elements, or a combination of these memory types.
The electronic device 1400 may also include one or more sensor systems 1410 positioned at virtually any location on the electronic device 1400. In some cases, the sensor system 1410 may include one or more SMI sensors positioned as described with reference to any of fig. 1-13. The sensor system 1410 may be configured to sense one or more types of parameters such as, but not limited to, vibration, light, touch, force, heat, movement, relative motion, biometric data of a user (e.g., biometric parameters), air quality, proximity, position, connectedness, and so on. By way of example, the sensor system 1410 may include an SMI sensor, a thermal sensor, a position sensor, a light or optical sensor, an accelerometer, a pressure transducer, a gyroscope, a magnetometer, a health monitoring sensor, an air quality sensor, and the like. Further, the one or more sensor systems 1410 may utilize any suitable sensing technology, including, but not limited to, interferometric, magnetic, capacitive, ultrasonic, resistive, optical, acoustic, or piezoelectric sensing technology.
The I/O mechanism 1412 may transmit or receive data from a user or another electronic device. The I/O mechanism 1412 may include an electronic display 1402, a touch-sensing input surface, a crown, one or more buttons (e.g., graphical user interface "home" buttons), one or more cameras (including an under-display camera), one or more microphones or speakers, one or more ports such as a microphone port and/or a keyboard. Additionally or alternatively, the I/O mechanism 1412 may transmit electronic signals via a communication interface, such as a wireless, wired, and/or optical communication interface. Examples of wireless and wired communication interfaces include, but are not limited to, cellular and Wi-Fi communication interfaces.
The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the embodiments. However, it will be apparent to one skilled in the art, after reading this description, that the specific details are not required in order to practice the embodiments. Thus, the foregoing descriptions of the specific embodiments described herein are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings.