Sensors for Electronic Finger Devices
A system may include one or more finger-mounted devices such as finger devices with U-shaped housings configured to be mounted on a user's fingers while gathering sensor input and supplying haptic output. The sensors may include strain gauge circuitry mounted on elongated arms of the housing. When the arms move due to finger forces, the strain gauge circuitry can measure the arm movement. The sensors may also include ultrasonic sensors. An ultrasonic sensor may have an ultrasonic signal emitter and a corresponding ultrasonic signal detector configured to detect the ultrasonic signals after passing through a user's finger. A two-dimensional ultrasonic sensor may capture ultrasonic images of a user's finger pad. Ultrasonic proximity sensors may be used to measure distances between finger devices and external surfaces. Optical sensors and other sensors may also be used in the finger devices.
- 1. A finger device configured to be worn on a finger of a user, comprising:
a housing configured to be coupled to the finger without covering a lower finger pad surface of the finger; an ultrasonic sensor coupled to the housing; a haptic output device; and control circuitry configured to gather input from the sensor as the finger moves and configured to provide haptic output to the finger using the haptic output device.
- Dependent Claims: 2, 3, 4, 5, 6, 7, 8, 9
- 10. A finger device configured to be worn on a finger of a user, comprising:
a U-shaped housing having first and second portions configured to rest respectively on first and second opposing sides of the finger without covering a lower finger pad surface of the finger, wherein the first and second portions have respective first and second elongated arms that rest against the finger; and strain gauge circuitry configured to receive strain measurements as the first and second elongated arms bend.
- Dependent Claims: 11, 12, 13, 14, 15, 16, 17, 18
- 19. A finger device configured to be worn on a finger of a user, comprising:
a housing having first and second portions configured to rest respectively on first and second opposing sides of the finger while leaving a finger pad surface of the finger exposed; a haptic output device coupled to the housing; an optical sensor; and control circuitry configured to provide haptic output to the finger using the haptic output device based on measurements from the optical sensor.
- Dependent Claims: 20, 21, 22, 23
This application claims priority to U.S. provisional patent application No. 62/680,495 filed Jun. 4, 2018, and provisional patent application No. 62/655,050 filed Apr. 9, 2018, which are hereby incorporated by reference herein in their entireties.
This relates generally to electronic devices, and, more particularly, to sensors for finger-mounted electronic devices.
Electronic devices such as computers can be controlled using computer mice and other input accessories. In virtual reality systems, force-feedback gloves can be used to control virtual objects. Cellular telephones may have touch screen displays and vibrators that are used to create haptic feedback in response to touch input.
Devices such as these may not be convenient for a user. For example, computer mice generally require flat surfaces for operation and are mostly used with desktop computers in fixed locations. Force-feedback gloves can be cumbersome and uncomfortable. Touch screen displays with haptic feedback only provide haptic output when a user is interacting with the displays.
A system may include one or more finger-mounted devices such as finger devices with U-shaped housings configured to be mounted on a user's fingers while gathering sensor input and supplying haptic output. The sensors may include strain gauge circuitry mounted on elongated arms of the housing. When the arms move due to finger forces, the strain gauge circuitry can measure the arm movement. This allows control circuitry in a finger device to gather information on finger motion and orientation relative to external structures. For example, information can be gathered on whether a user's finger has touched an external surface, information on shear forces imposed as a user's finger drags along a surface, information on the distance separating a finger from a surface, and other finger information.
In some arrangements a finger device may include ultrasonic sensors. An ultrasonic sensor may have an ultrasonic signal emitter and a corresponding ultrasonic signal detector configured to detect the ultrasonic signals after passing through a user's finger. A two-dimensional ultrasonic sensor may capture ultrasonic images of a user's finger pad. Ultrasonic proximity sensors may be used to measure distances between finger devices and external surfaces. Optical sensors and other sensors may also be used in the finger devices.
Finger input gathered using one or more finger devices may be provided to ancillary equipment such as electronic equipment with a display and may be used in controlling the operation of the electronic equipment.
Electronic devices that are configured to be mounted on the body of a user may be used to gather user input and to provide a user with output. For example, electronic devices that are configured to be worn on one or more of a user's fingers, which are sometimes referred to as finger devices or finger-mounted devices, may be used to gather user input and to supply output. A finger device may, as an example, include an inertial measurement unit with an accelerometer for gathering information on finger motions such as finger taps or free-space finger gestures, may include force sensors for gathering information on normal and shear forces between the finger device and the user's finger, and may include other sensors for gathering information on the interactions between the finger device (and the user's finger on which the device is mounted) and the surrounding environment. The finger device may include a haptic output device to provide the user's finger with haptic output and may include other output components.
One or more finger devices may gather user input from a user. The user may use finger devices in operating a virtual reality or mixed reality device (e.g., head-mounted equipment such as glasses, goggles, a helmet, or other device with a display). During operation, the finger devices may gather user input such as information on interactions between the finger device(s) and the surrounding environment (e.g., interactions between a user's fingers and the environment, including finger motions and other interactions associated with virtual content displayed for a user). The user input may be used in controlling visual output on the display. Corresponding haptic output may be provided to the user's fingers using the finger devices. Haptic output may be used, for example, to provide the fingers of a user with a desired texture sensation as a user is touching a real object or as a user is touching a virtual object.
Finger devices can be worn on any or all of a user's fingers (e.g., the index finger, the index finger and thumb, three of a user's fingers on one of the user's hands, some or all fingers on both hands, etc.). To enhance the sensitivity of a user's touch as the user interacts with surrounding objects, finger devices may have inverted U shapes or other configurations that allow the finger devices to be worn over the top and sides of a user's finger tips while leaving the user's finger pads exposed. This allows a user to touch objects with the finger pad portions of the user's fingers during use. Users can use the finger devices to interact with any suitable electronic equipment. For example, a user may use one or more finger devices to interact with a virtual reality or mixed reality system (e.g., a head-mounted device with a display), to supply input to a desktop computer, tablet computer, cellular telephone, watch, ear buds, or other accessory, or to interact with other electronic equipment.
With one illustrative configuration, which may sometimes be described herein as an example, device 10 is a finger-mounted device having a finger-mounted housing with a U-shaped body that grasps a user's finger or a finger-mounted housing with other shapes configured to rest against a user's finger and device(s) 24 is a cellular telephone, tablet computer, laptop computer, wristwatch device, head-mounted device, a device with a speaker, or other electronic device (e.g., a device with a display, audio components, and/or other output components).
Devices 10 and 24 may include control circuitry 12 and 26. Control circuitry 12 and 26 may include storage and processing circuitry for supporting the operation of system 8. The storage and processing circuitry may include storage such as nonvolatile memory (e.g., flash memory or other electrically-programmable-read-only memory configured to form a solid state drive), volatile memory (e.g., static or dynamic random-access-memory), etc. Processing circuitry in control circuitry 12 and 26 may be used to gather input from sensors and other input devices and may be used to control output devices. The processing circuitry may be based on one or more microprocessors, microcontrollers, digital signal processors, baseband processors and other wireless communications circuits, power management units, audio chips, application specific integrated circuits, etc.
To support communications between devices 10 and 24 and/or to support communications between equipment in system 8 and external electronic equipment, control circuitry 12 may communicate using communications circuitry 14 and/or control circuitry 26 may communicate using communications circuitry 28. Circuitry 14 and/or 28 may include antennas, radio-frequency transceiver circuitry, and other wireless communications circuitry and/or wired communications circuitry. Circuitry 14 and/or 28, which may sometimes be referred to as control circuitry and/or control and communications circuitry, may, for example, support bidirectional wireless communications between devices 10 and 24 over wireless link 38 (e.g., a wireless local area network link, a near-field communications link, or other suitable wired or wireless communications link such as a Bluetooth® link, a WiFi® link, a 60 GHz link or other millimeter wave link, etc.). Devices 10 and 24 may also include power circuits for transmitting and/or receiving wired and/or wireless power and may include batteries. In configurations in which wireless power transfer is supported between devices 10 and 24, in-band wireless communications may be supported using inductive power transfer coils (as an example).
Devices 10 and 24 may include input-output devices such as devices 16 and 30. Input-output devices 16 and/or 30 may be used in gathering user input, in gathering information on the environment surrounding the user, and/or in providing a user with output. Devices 16 may include sensors 18 and devices 24 may include sensors 32. Sensors 18 and/or 32 may include force sensors (e.g., strain gauges, capacitive force sensors, resistive force sensors, etc.), audio sensors such as microphones, touch and/or proximity sensors such as capacitive sensors, optical sensors such as optical sensors that emit and detect light, ultrasonic sensors, and/or other touch sensors and/or proximity sensors, monochromatic and color ambient light sensors, image sensors, sensors for detecting position, orientation, and/or motion (e.g., accelerometers, magnetic sensors such as compass sensors, gyroscopes, and/or inertial measurement units that contain some or all of these sensors), muscle activity sensors (EMG) for detecting finger actions, radio-frequency sensors, depth sensors (e.g., structured light sensors and/or depth sensors based on stereo imaging devices), optical sensors such as self-mixing sensors and light detection and ranging (lidar) sensors that gather time-of-flight measurements, humidity sensors, moisture sensors, and/or other sensors. In some arrangements, devices 10 and/or 24 may use sensors 18 and/or 32 and/or other input-output devices 16 and/or 30 to gather user input (e.g., buttons may be used to gather button press input, touch sensors overlapping displays can be used for gathering user touch screen input, touch pads may be used in gathering touch input, microphones may be used for gathering audio input, accelerometers may be used in monitoring when a finger contacts an input surface and may therefore be used to gather finger press input, etc.).
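The passage above notes that an accelerometer can reveal when a finger contacts an input surface, because a tap produces a brief acceleration spike. A minimal sketch of such spike-based finger-press detection follows; the function name, threshold, and refractory window are hypothetical values chosen for illustration, not taken from the text:

```python
def detect_taps(accel_samples, threshold=2.5, refractory=5):
    """Return indices of tap-like spikes in a stream of acceleration magnitudes.

    accel_samples: acceleration magnitudes (e.g., in g) sampled over time.
    threshold: illustrative spike level treated as a surface contact.
    refractory: samples to skip after a detection to avoid double-counting
    the ringing that follows a single tap.
    """
    taps = []
    i = 0
    while i < len(accel_samples):
        if accel_samples[i] > threshold:
            taps.append(i)
            i += refractory  # ignore the immediate aftermath of this tap
        else:
            i += 1
    return taps
```

In practice such a detector would run on filtered sensor data and could be fused with the force and proximity measurements described elsewhere in this document.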
Devices 16 and/or 30 may include haptic output devices 20 and/or 34. Haptic output devices 20 and/or 34 can produce motion that is sensed by the user (e.g., through the user's fingertips). Haptic output devices 20 and/or 34 may include actuators such as electromagnetic actuators, motors, piezoelectric actuators, electroactive polymer actuators, vibrators, linear actuators, rotational actuators, actuators that bend bendable members, actuator devices that create and/or control repulsive and/or attractive forces between devices 10 and/or 24 (e.g., components for creating electrostatic repulsion and/or attraction such as electrodes, components for producing ultrasonic output such as ultrasonic transducers, components for producing magnetic interactions such as electromagnets for producing direct-current and/or alternating-current magnetic fields, permanent magnets, magnetic materials such as iron or ferrite, and/or other circuitry for producing repulsive and/or attractive forces between devices 10 and/or 24). In some situations, actuators for creating forces in device 10 may be used in squeezing a user's finger and/or otherwise directly interacting with a user's finger pulp. In other situations, these components may be used to interact with each other (e.g., by creating a dynamically adjustable electromagnetic repulsion and/or attraction force between a pair of devices 10 and/or between device(s) 10 and device(s) 24 using electromagnets).
If desired, input-output devices 16 and/or 30 may include other devices 22 and/or 36 such as displays (e.g., in device 24 to display images for a user), status indicator lights (e.g., a light-emitting diode in device 10 and/or 24 that serves as a power indicator, and other light-based output devices), speakers and other audio output devices, electromagnets, permanent magnets, structures formed from magnetic material (e.g., iron bars or other ferromagnetic members that are attracted to magnets such as electromagnets and/or permanent magnets), batteries, etc. Devices 10 and/or 24 may also include power transmitting and/or receiving circuits configured to transmit and/or receive wired and/or wireless power signals.
A user may wear one or more of devices 10 simultaneously. For example, a user may wear a single one of devices 10 on the user's ring finger or index finger. As another example, a user may wear a first device 10 on the user's thumb, a second device 10 on the user's index finger, and an optional third device 10 on the user's middle finger. Arrangements in which devices 10 are worn on other fingers and/or all fingers of one or both hands of a user may also be used.
Control circuitry 12 (and, if desired, communications circuitry 14 and/or input-output devices 16) may be contained entirely within device 10 (e.g., in a housing for a fingertip-mounted unit) and/or may include circuitry that is coupled to a fingertip structure (e.g., by wires from an associated wrist band, glove, fingerless glove, etc.). Configurations in which devices 10 have bodies that are mounted on individual user fingertips are sometimes described herein as an example.
As shown in
The sensors in device 10 can measure how forcefully a user is moving device 10 (and finger 40) against surface 48 (e.g., in a direction parallel to the surface normal n of surface 48 such as the -Z direction of
Structure 50 may be a portion of a housing of device 24, may be a portion of another device 10 (e.g., another housing 44), may be a portion of a user's finger 40 or other body part, may be a surface of a real-world object such as a table, a movable real-world object such as a bottle or pen, or other inanimate object external to device 10, and/or may be any other structure that the user can contact with finger 40 while moving finger 40 in a desired direction with a desired force. Because motions such as these can be sensed by device 10, device(s) 10 can be used to gather pointing input (e.g., input moving a cursor or other virtual object on a display such as a display in devices 36), can be used to gather tap input, swipe input, pinch-to-zoom input (e.g., when a pair of devices 10 is used), or other gesture input (e.g., finger gestures, hand gestures, arm motions, etc.), and/or other user input.
Portions 52 may, if desired, each be separated from remaining portions of housing 44 by a horizontal slot 48. Portions 52 may have elongated shapes that extend horizontally parallel to longitudinal finger device axis 54. Slots 48 may also extend along axis 54. Due to the presence of slot 48, each portion 52 may bend laterally (e.g., when pressed sideways by finger 40).
Strain gauges or other sensors may be used in measuring the bending of portion 52. For example, a pair of strain gauges 56 may be placed on an area of portion 52 near the base of slot 48 as shown in
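A paired-gauge arrangement of the kind described above lends itself to a simple readout model: subtracting the two readings cancels common-mode effects such as temperature drift and leaves a signed deflection. The sketch below assumes the arm behaves like a linear spring; the function names, gain, and stiffness constants are illustrative and not from this document:

```python
def arm_bend(gauge_a, gauge_b, gain=1.0):
    """Estimate lateral arm deflection from a pair of strain gauge readings.

    Differencing the two gauges rejects common-mode effects (e.g., temperature
    drift); the sign of the result indicates the direction of the bend.
    """
    return gain * (gauge_a - gauge_b)


def finger_squeeze_force(deflection, stiffness=0.8):
    """Convert deflection to an approximate sideways finger force, assuming
    the elongated arm acts as a linear spring (an illustrative model)."""
    return stiffness * deflection
```

A positive result might correspond to the finger pressing the arm outward and a negative result to the arm relaxing inward; the mapping to real units would come from calibration.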
As shown in
As shown in the cross-sectional front view of device 10 of
As shown in the example of
In the example of
In scenarios of the type shown in
If desired, multiple finger devices 10 may interact in system 8 and may be used in providing multi-finger user input to device 24. Electrical components such as ultrasonic sound emitters and ultrasonic sound detectors can be included in these devices to track relative movements between the devices. Consider, as an example, the arrangement of
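One way the ultrasonic emitter/detector pairs mentioned above could track relative movement between devices is one-way time-of-flight ranging: one device emits a pulse, the other timestamps its arrival, and the separation follows from the speed of sound. A minimal sketch, assuming the two devices share a synchronized clock (an assumption not stated in the text):

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at ~20 degrees C


def distance_from_time_of_flight(t_emit_s, t_detect_s):
    """Distance between an ultrasonic emitter on one finger device and a
    detector on another, from a one-way time-of-flight measurement.

    Assumes synchronized clocks on both devices; real systems often use
    round-trip timing or a radio sync pulse to avoid this requirement.
    """
    return SPEED_OF_SOUND_M_S * (t_detect_s - t_emit_s)
```

Repeating the measurement across several emitter/detector pairs would let control circuitry estimate relative position, not just distance.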
Ultrasonic sensors such as two-dimensional ultrasonic sensor arrays can be used in gathering ultrasonic image data of finger pulp 40P. This ultrasonic image data can be analyzed to determine the shape of finger 40 and thereby analyze whether finger pulp 40P has contacted surface 48 of structure 50. An illustrative two-dimensional ultrasonic image sensor is shown in
As shown in
As shown in
If desired, a finger device may have a single optical sensor 100. In this arrangement, an optical sensor 100 on one side of finger 40 may emit light and may also detect whether any of the emitted light is reflected from surface 48 and received. The strength of the measured reflected signal may vary inversely with the distance between sensor 100 (e.g., device 10 and housing 44) and surface 48 of structure 50. If device 10 is far from surface 48, the reflected signal will be weak. If device 10 is near surface 48, the reflected signal will be strong.
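Given the stated relationship (weak reflection when far, strong when near), a rough way to invert a reflected-intensity reading into a distance estimate is to assume an inverse-square falloff calibrated at a known reference distance. Both calibration constants below are hypothetical:

```python
import math


def estimate_distance(reflected_intensity, calib_intensity=1.0,
                      calib_distance_m=0.01):
    """Estimate sensor-to-surface distance from reflected signal strength.

    Assumes intensity falls off with the square of distance, calibrated so
    that calib_intensity is observed at calib_distance_m (both values are
    illustrative placeholders, not from this document).
    """
    if reflected_intensity <= 0:
        return float("inf")  # no detectable reflection: surface out of range
    return calib_distance_m * math.sqrt(calib_intensity / reflected_intensity)
```

A real sensor would also need to reject ambient light, e.g., by modulating the emitter and demodulating the detected signal.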
In configurations in which device 10 contains multiple sensors 100, light emitted from a first of sensors 100 on one side of finger 40 may be measured by a second of sensors 100 on an opposing side of finger 40. When finger 40 is in contact with surface 48, more of the emitted light may be blocked by finger 40 than when finger 40 is not in contact with surface 48. Some light may also be transmitted through pulp 40P, so the amount that finger 40 is compressed against surface 48 may also affect light transmission. As a result, the strength of the emitted light from one sensor 100 that is detected by another sensor 100 may provide information on whether finger 40 is in contact with surface 48, how forcefully finger 40 is pressing against surface 48, and other information about the orientation and motion of finger 40 relative to surface 48.
The optical characteristics of finger 40 (e.g., the outward appearance of finger 40 such as the color of finger 40, the transmittance of finger 40 at one or more different wavelengths of light, etc.) may be monitored using optical sensors 100. As an example, the light emitter 102 on the left of finger 40 may emit light at multiple wavelengths (e.g., one or more infrared wavelengths, one or more visible light wavelengths such as red, green, and blue wavelengths, etc.). The light detector 104 on the right side of finger 40 may be a color light sensor that contains multiple photodetectors that are configured to measure light at different respective wavelengths (e.g., one or more infrared wavelengths, one or more visible light wavelengths such as red, green, and blue wavelengths, or other wavelengths corresponding to the wavelengths of light emitted by emitter 102) and/or light of different colors may be emitted at different times while light detector 104 makes synchronized measurements (e.g., so that the light transmission of finger 40 at each wavelength can be determined). As finger 40 is compressed against surface 48, the color and light transmittance (transmission) of finger 40 will change. The transmission spectrum of finger 40 can be measured dynamically using the multi-wavelength light emitted by light emitter 102 and the color light sensor of detector 104. By analyzing the transmission spectrum (e.g., the color of light transmitted through finger 40), changes in the spectrum (e.g., color changes in finger 40 due to contact with surface 48) can be detected.
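The spectrum analysis described above reduces to two small steps: compute per-wavelength transmittance (detected power over emitted power), then compare against a no-contact baseline. A hedged sketch, with the shift threshold chosen purely for illustration:

```python
def transmittance(detected, emitted):
    """Per-wavelength transmittance of the finger: detected / emitted power.

    Both arguments are dicts keyed by wavelength label (e.g., "red", "ir").
    """
    return {w: detected[w] / emitted[w] for w in emitted}


def contact_likely(current, baseline, min_shift=0.1):
    """Flag a probable surface contact when any wavelength's transmittance
    has shifted from the no-contact baseline by more than min_shift
    (an illustrative threshold, not a value from this document)."""
    return any(abs(current[w] - baseline[w]) > min_shift for w in baseline)
```

The baseline would be captured while the finger is known to be lifted, then each new reading compared against it as the finger moves.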
If desired, detector 104 may be a color light sensor that monitors the color of finger 40 under exposure to ambient light. Color changes can be detected when finger 40 contacts surface 48 (e.g., finger 40 may appear whiter when compressed against surface 48 so that blood vessels in finger 40 are pinched and contain less blood than when finger 40 is not touching surface 48 and is not compressed). When using ambient light illumination for finger 40, light emitter 102 can be omitted.
Another illustrative configuration measures light transmission through finger 40 at a single wavelength, rather than gathering light transmission data at multiple wavelengths. The amount of light transmission will be affected by finger compression and can therefore be used to detect when finger 40 contacts surface 48. If desired, infrared light, which penetrates into finger 40 more effectively than visible light, may be used (alone or in conjunction with making visible light measurements). For example, an infrared light-emitting diode may be located on the left of finger 40 and a corresponding infrared light detector may be located on the right of finger 40 to monitor infrared light transmission through finger 40.
For measuring the optical characteristics of finger 40, light sensors 100 may be mounted near fingernail 42, where color changes under finger compression are often most evident. For example, one or more light-emitting diodes (visible, infrared, etc.) may emit light into one side of finger 40 near fingernail 42 while detector 104 monitors the amount of this light that is transmitted through finger 40.
Combinations of these arrangements, arrangements in which optical sensing is used to detect occlusion of a light ray traveling under finger 40 as pad 40P touches surface 48, and/or other optical finger sensing arrangements may be used.
In some arrangements, information on devices 10 such as information on finger motion, finger location, finger forces arising from situations in which finger pulp 40P is pressing against external surfaces, and/or other information on the orientation and motion of finger(s) 40 can be gathered using sensing arrangements of more than one type. For example, sensor circuitry for device 10 may include strain gauges, other force sensors, ultrasonic sensors, optical sensors, ultrasonic imaging sensors, ultrasonic sensors measuring finger pulp signal dampening, magnetic sensors, radio-frequency sensors, imaging sensors (e.g., tracking cameras), inertial measurement units (e.g., accelerometers, gyroscopes, and/or compass sensors), touch sensors (e.g., capacitive touch sensors), other sensors, any combination of sensors of one or more, two or more, three or more, or four or more of these sensor types, and/or other finger monitoring arrangements.
During operation of system 8, finger information gathered using one or more finger devices 10 can be used to detect user input. The user input may include user finger gestures including taps, swipes, multi-finger gestures such as pinch-to-zoom gestures, and/or other finger input. In response to finger input gathered with finger devices 10, devices 10 and/or one or more devices that receive the finger information from devices 10 such as electronic device 24 of system 8 of
A perspective view of finger device 10 in an illustrative configuration in which housing 44 of device 10 has a curved trailing edge such as curved rear edge 44E is shown in
The foregoing is merely illustrative and various modifications can be made to the described embodiments. The foregoing embodiments may be implemented individually or in any combination.