Abstract:
An optical assembly including a reflectance-based sensor emitting light into a detection plane and detecting reflections of the emitted light, reflected by an object located in the detection plane, a light redirector positioned away from the sensor redirecting light emitted by the sensor into one or more spatial planes parallel to the detection plane and, when the object is located in the one or more spatial planes, redirecting light reflected by the object into the detection plane, and a processor controlling light emitted by the sensor and receiving outputs from the sensor, and configured such that when an object passes through one or more of the spatial planes, the processor identifies both the spatial planes through which the object passes and the object's location within those planes, based on the received outputs and the position of the light redirector relative to the sensor.
Abstract:
A proximity sensor including a housing, light emitters mounted in the housing for projecting light out of the housing along a detection plane, light detectors mounted in the housing for detecting amounts of light entering the housing along the detection plane, whereby for each emitter-detector pair (E, D), when an object is located at a target position p(E, D) in the detection plane corresponding to the pair (E, D), then the light emitted by emitter E is scattered by the object and is expected to be maximally detected by detector D, and a processor to synchronously activate emitter-detector pairs, to read the detected amounts of light from the detectors, and to calculate a location of the object in the detection plane from the detected amounts of light, in accordance with a detection-location relationship that relates detections from emitter-detector pairs to object locations between neighboring target positions in the detection plane.
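The detection-location relationship above can be sketched in a few lines. This is an illustrative assumption, not the patented method: each activated pair (E, D) contributes its known target position p(E, D), and an object between neighboring target positions is estimated by weighting those positions by the detected light amounts. The function name and the simple weighted-average rule are hypothetical.

```python
def interpolate_location(detections):
    """Hypothetical detection-location relationship for a reflectance
    proximity sensor.

    detections: list of ((x, y), amount) tuples for activated
    emitter-detector pairs, where (x, y) is the target position
    p(E, D) of maximal detection for that pair, and amount is the
    light detected by D while E was activated.

    Returns a weighted-average estimate of the object location in the
    detection plane, or None if no light was detected.
    """
    total = sum(amount for _, amount in detections)
    if total == 0:
        return None  # object not present in the detection plane
    x = sum(px * a for (px, _), a in detections) / total
    y = sum(py * a for (_, py), a in detections) / total
    return (x, y)
```

For example, an object lying between two neighboring target positions, reflecting more strongly toward the nearer one, is placed proportionally closer to it.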
Abstract:
A touch system including a flexible touch surface, emitters projecting light beams across the surface such that the beams are incident upon and reflected by the surface when crossing the surface, detectors detecting reflections of the projected beams by an object on the surface, lenses oriented such that there is a particular angle of entry at which each detector receives a maximal light intensity when beams enter a lens corresponding to the detector at the angle of entry, and such that there are target positions on the surface whereby for each emitter-detector pair, when the object is located at a target position associated with the pair, then light beams emitted by the emitter of the pair are reflected by the object into the lens corresponding to the detector of the pair at the angle of entry, and a processor synchronously co-activating emitter-detector pairs and calculating a location of the object on the surface.
Abstract:
A steering wheel that identifies gestures performed on its surface, including a circular gripping element including a thumb-receiving notch disposed along its circumference, an array of light-based proximity sensors, mounted in the gripping element, that projects light beams through the notch radially outward from the gripping element, and detects light beams reflected back into the gripping element by a moving object at or near the notch, and a processor, coupled with the proximity sensor array, for determining polar angles along the circumference of the gripping element occupied by the object, responsive to the proximity sensor array detecting light beams that it projected and that were reflected back by the object.
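One way the polar-angle determination and gesture identification above might work is sketched below; the sensor count, even angular spacing, and the swipe rule are illustrative assumptions not stated in the abstract.

```python
# Hypothetical mapping from a steering-wheel proximity-sensor array to
# polar angles along the circular grip, plus a minimal swipe detector.

NUM_SENSORS = 24  # assumed: sensors spaced evenly around the circumference

def sensor_to_polar_angle(index):
    """Polar angle (degrees) along the grip for sensor `index`."""
    return (360.0 / NUM_SENSORS) * index

def detect_swipe(frames):
    """frames: successive lists of indices of sensors that detected a
    reflection. Returns 'cw', 'ccw', or None by comparing the mean
    occupied polar angle across frames (wrap-around at 360 degrees is
    ignored here for brevity)."""
    angles = [sum(sensor_to_polar_angle(i) for i in f) / len(f)
              for f in frames if f]
    if len(angles) < 2:
        return None
    delta = angles[-1] - angles[0]
    if delta == 0:
        return None
    return 'cw' if delta > 0 else 'ccw'
```

A thumb sliding along the notch thus appears as a sequence of occupied polar angles whose drift direction distinguishes clockwise from counter-clockwise gestures.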
Abstract:
An interactive mid-air display including a display that presents a graphical user interface (GUI), optics projecting and rotating the GUI above the display such that the GUI is visible in-air in a plane rotated away from the display, a sensor including light emitters projecting beams towards the projected GUI, light detectors detecting reflections of the beams by objects interacting with the projected GUI, and a lens structure maximizing detection at each detector for light entering the lens structure at a respective location at a specific angle θ, whereby for each emitter-detector pair, maximum detection of light corresponds to the object being at a specific location in the projected GUI, in accordance with the locations of the emitter and detector and the angle θ, and a processor mapping detections of light for emitter-detector pairs to corresponding locations in the projected GUI, and translating the mapped locations to coordinates on the display.
Abstract:
A proximity sensor, including a housing, an array of lenses mounted in the housing, an array of alternating light emitters and light detectors mounted in the housing, each detector being positioned along the image plane of a respective one of the lenses so as to receive maximum light intensity when light enters the lens at a particular angle, an activating unit mounted in the housing and connected to the emitters and detectors, synchronously co-activating each emitter with at least one of the detectors, each activated emitter projecting light out of the housing along a detection plane, and a processor receiving outputs from the detectors corresponding to amounts of projected light reflected by an object in the detection plane to the detectors, and calculating a two-dimensional location of the object in the detection plane based on the detector outputs and the particular angle.
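The two-dimensional location calculation above can be illustrated with simple trigonometry, under two assumptions not stated in the abstract: each emitter projects its beam perpendicular to the housing edge, and the particular angle θ is measured from the detector lens's normal. For an emitter at position x_e and a detector at x_d along the housing, a maximal reflection at that detector implies the object lies at (x_e, y) with tan(θ) = (x_e − x_d) / y.

```python
import math

def object_location(x_emitter, x_detector, theta_rad):
    """Hypothetical sketch: return the (x, y) location of a reflecting
    object in the detection plane, assuming the emitter beam leaves the
    housing perpendicularly at x_emitter and the reflection enters the
    detector lens at x_detector at angle theta_rad from the lens normal.

    Geometry: tan(theta) = (x_emitter - x_detector) / y,
    so y = (x_emitter - x_detector) / tan(theta)."""
    y = (x_emitter - x_detector) / math.tan(theta_rad)
    return (x_emitter, y)
```

Because θ is fixed by the lens design, each synchronously co-activated emitter-detector pair whose detector reports maximal light pins the object to one point in the plane; interpolating between pairs refines the estimate.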
Abstract:
A user interface for a vehicle, including a steering wheel for the vehicle, including a grip, a sensor operable to detect objects at a plurality of locations along the grip, and an illuminator operable to illuminate different portions of the grip, a processor in communication with the sensor, with the illuminator and with a controller of vehicle functions, and a non-transitory computer readable medium storing instructions which cause the processor: to identify, via the sensor, a location of a first object along the grip, to illuminate, via the illuminator, a portion of the grip adjacent to the identified location, to further identify, via the sensor, a second object being at the illuminated portion of the grip, and to activate, via the controller, a vehicle function in response to the second object identified as being at the illuminated portion of the grip.
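The two-step interaction described above (detect a first object, illuminate the adjacent portion, then activate a function when a second object touches the lit portion) can be sketched as a minimal state machine. The sensor, illuminator, and controller interfaces here are hypothetical stand-ins, as is the notion of "adjacent" as the next grip portion.

```python
class GripInterface:
    """Hypothetical sketch of the grip's detect-illuminate-activate flow."""

    def __init__(self, illuminator, controller):
        self.illuminator = illuminator
        self.controller = controller
        self.lit_portion = None  # grip portion currently illuminated

    def on_detection(self, location):
        if self.lit_portion is None:
            # First object: illuminate the portion adjacent to it
            # ("adjacent" modeled here as the next portion index).
            self.lit_portion = location + 1
            self.illuminator.illuminate(self.lit_portion)
        elif location == self.lit_portion:
            # Second object at the illuminated portion: activate the
            # vehicle function via the controller, then reset.
            self.controller.activate("vehicle_function")
            self.lit_portion = None
```

Requiring the second touch to land on the illuminated portion guards against accidental activation while gripping the wheel normally.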
Abstract:
A touch screen assembly including a glass screen, LEDs, photo diodes, a transparent plastic frame surrounding the screen, and a light guide that guides light emitted by the LEDs to the photo diodes along light paths that go under the frame on one side, over the screen, and under the frame on the opposite side, and a processor operative to selectively activate LEDs and photo diodes, to identify a location of an object touching the screen, based on amounts of light detected by activated photo diodes when light emitted by activated LEDs is blocked along its light path by the object, and to recognize the object touching the frame, based on amounts of light detected by activated photo diodes when light emitted by activated LEDs is absorbed along its light path by the object, thereby providing light-based touch sensitivity to the screen and to the frame.
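The blocked-light touch location above can be illustrated with a shadow-crossing sketch: horizontal and vertical light paths traverse the screen, a touch attenuates some of them, and the touch lies where a blocked horizontal path crosses a blocked vertical path. The beam indexing, the blocking threshold, and the single-touch centroid rule are illustrative assumptions.

```python
BLOCK_THRESHOLD = 0.5  # assumed: a path is "blocked" below 50% of baseline

def touch_location(h_levels, v_levels, h_baseline, v_baseline):
    """Hypothetical shadow-based touch location for a light-guide
    touch screen.

    h_levels / v_levels: detected light per horizontal / vertical path.
    h_baseline / v_baseline: untouched (calibration) light levels.

    Returns (column, row) of the blocked crossing as the centroid of
    blocked beams (simple single-touch case), or None if no path on
    one of the axes is blocked."""
    cols = [i for i, (lvl, base) in enumerate(zip(v_levels, v_baseline))
            if lvl < BLOCK_THRESHOLD * base]
    rows = [i for i, (lvl, base) in enumerate(zip(h_levels, h_baseline))
            if lvl < BLOCK_THRESHOLD * base]
    if not cols or not rows:
        return None
    return (sum(cols) / len(cols), sum(rows) / len(rows))
```

Recognizing a touch on the frame would use the same detectors but compare against a smaller, absorption-sized dip rather than a full blockage; that second threshold is omitted here for brevity.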
Abstract:
A vehicle user interface including a vehicle steering wheel including a grip, a sensor mounted in the steering wheel grip detecting objects touching the steering wheel grip, a plurality of individually activatable illumination units illuminating respective locations on the steering wheel grip, and a processor receiving outputs from the sensor, selectively activating a subset of the illumination units adjacent to a detected object, and controlling a plurality of vehicle functions in response to outputs of the sensor.