Abstract:
A system for use in a vehicle, including a steering element situated opposite a driver seat in a vehicle, the steering element including a plurality of proximity sensors encased in the periphery of the steering element operable to detect hand gestures along the outer periphery of the steering element, an interactive deck housed in the vehicle, for providing at least one of radio broadcast, video broadcast, audio entertainment, video entertainment and navigational assistance in the vehicle, and a processor housed in the vehicle, coupled with the proximity sensors and the deck, operable to identify the hand gestures detected by the proximity sensors, and to control the deck in response to thus-identified hand gestures.
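As a rough illustration of the abstract above, the following Python sketch maps proximity-sensor activations along the steering element's periphery to deck commands. The gesture names, the InteractiveDeck methods and the sensor read-out format are assumptions made for illustration only, not the claimed implementation.

    # Minimal sketch, assuming a simple swipe/tap vocabulary and a toy deck API.
    class InteractiveDeck:
        """Hypothetical deck interface (radio, media, navigation)."""
        def next_station(self):     print("deck: next radio station")
        def previous_station(self): print("deck: previous radio station")
        def volume_step(self, d):   print("deck: volume " + ("+1" if d > 0 else "-1"))

    def identify_gesture(samples):
        """Toy classifier: derives a gesture label from proximity-sensor
        activations along the outer periphery of the steering element.
        `samples` is a list of (sensor_index, timestamp) activations."""
        if len(samples) < 2:
            return "tap"
        # Positive index drift = clockwise swipe, negative = counter-clockwise.
        drift = samples[-1][0] - samples[0][0]
        if drift > 0:
            return "swipe_cw"
        if drift < 0:
            return "swipe_ccw"
        return "hold"

    GESTURE_TO_ACTION = {
        "swipe_cw":  lambda deck: deck.next_station(),
        "swipe_ccw": lambda deck: deck.previous_station(),
        "tap":       lambda deck: deck.volume_step(+1),
    }

    def on_sensor_frame(samples, deck):
        gesture = identify_gesture(samples)
        action = GESTURE_TO_ACTION.get(gesture)
        if action:
            action(deck)

    if __name__ == "__main__":
        deck = InteractiveDeck()
        on_sensor_frame([(3, 0.00), (4, 0.05), (5, 0.10)], deck)  # clockwise swipe
        on_sensor_frame([(7, 0.00)], deck)                        # single tap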
Abstract:
A graphics tablet system including a housing, a touch screen in the housing for receiving touch input, and for displaying graphics corresponding to the received touch input, a plurality of styli for performing touch input, each stylus including an RFID chip storing one or more graphic attributes, and a visible indicator of the one or more graphic attributes, an RFID reader in the housing for reading the stored one or more graphic attributes from a stylus touching the touch screen, and a processor in the housing, connected to the touch screen and to the RFID reader, for rendering a drawing on the touch screen according to the motion of the stylus on the touch screen and according to one or more graphic attributes read by the RFID reader from the stylus, wherein different styli of the plurality store different graphic attributes.
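The following Python sketch illustrates the stylus-attribute flow described above: a dictionary look-up stands in for the RFID reader, and a print stands in for rendering on the touch screen. The tag payload format, the attribute names and the function names are illustrative assumptions.

    # Minimal sketch, assuming each RFID chip stores a color and a line width.
    from dataclasses import dataclass

    @dataclass
    class GraphicAttributes:
        color: str        # e.g. "red"
        line_width: int   # pixels

    # Hypothetical tag contents: RFID chip id -> stored graphic attributes.
    RFID_TAGS = {
        "stylus-01": GraphicAttributes(color="red",  line_width=2),
        "stylus-02": GraphicAttributes(color="blue", line_width=6),
    }

    def read_attributes(rfid_id):
        """Stands in for the RFID reader in the housing."""
        return RFID_TAGS[rfid_id]

    def render_stroke(points, attrs):
        """Stands in for rendering on the touch screen; here we only report
        what would be drawn."""
        print(f"stroke of {len(points)} points, "
              f"color={attrs.color}, width={attrs.line_width}px")

    if __name__ == "__main__":
        # Each stylus produces strokes rendered with its own stored attributes.
        render_stroke([(0, 0), (5, 5), (10, 8)], read_attributes("stylus-01"))
        render_stroke([(2, 9), (4, 7)],          read_attributes("stylus-02"))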
Abstract:
A vehicle autonomous drive system including a steering wheel, a sensor operable to identify each gesture component within a set of gesture components performed on the steering wheel by a driver of the vehicle, the set of gesture components including thumb-tap, thumb touch-and-hold, thumb-glide, hand-grab and hand-tap, a processor for an autonomous drive system in the vehicle, receiving from the sensor a series of time-stamped contact coordinates for the gesture components identified by the sensor, and a non-transitory computer readable medium storing instructions thereon that, when executed by the processor, cause the processor to construct compound gestures based on the series of time-stamped contact coordinates, and to activate features of the autonomous drive system in response to the compound gestures.
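The Python sketch below illustrates how a processor might group time-stamped gesture components into compound gestures and map them to autonomous-drive features. The component names follow the abstract; the grouping window and the feature mapping are illustrative assumptions, not the claimed method.

    # Minimal sketch, assuming components within 0.6 s form one compound gesture.
    from dataclasses import dataclass

    @dataclass
    class GestureComponent:
        kind: str          # "thumb_tap", "thumb_hold", "thumb_glide", "hand_grab", "hand_tap"
        timestamp: float   # seconds
        coord: float       # contact coordinate along the steering wheel rim

    COMPOUND_WINDOW = 0.6  # max seconds between components of one compound gesture

    def build_compounds(components):
        """Groups successive components whose time stamps fall within
        COMPOUND_WINDOW of the previous component."""
        compounds, current = [], []
        for c in sorted(components, key=lambda c: c.timestamp):
            if current and c.timestamp - current[-1].timestamp > COMPOUND_WINDOW:
                compounds.append(tuple(g.kind for g in current))
                current = []
            current.append(c)
        if current:
            compounds.append(tuple(g.kind for g in current))
        return compounds

    # Illustrative mapping of compound gestures to autonomous-drive features.
    FEATURES = {
        ("thumb_tap", "thumb_tap"): "engage_autonomous_drive",
        ("hand_grab",):             "resume_manual_control",
        ("thumb_glide",):           "adjust_following_distance",
    }

    if __name__ == "__main__":
        stream = [
            GestureComponent("thumb_tap", 1.00, 0.25),
            GestureComponent("thumb_tap", 1.30, 0.26),
            GestureComponent("hand_grab", 3.10, 0.50),
        ]
        for compound in build_compounds(stream):
            print(compound, "->", FEATURES.get(compound, "no-op"))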
Abstract:
A sensor, including light emitters projecting directed light beams, light detectors interleaved with the light emitters, lenses, each lens oriented relative to a respective one of the light detectors such that the light detector receives maximum intensity when light enters the lens at an angle b, whereby, for each emitter E, there exist corresponding target positions p(E, D) along the path of the light from emitter E, at which an object located at any of the target positions reflects the light projected by emitter E towards a respective one of detectors D at angle b, and a processor storing a reflection value R(E, D) for each co-activated emitter-detector pair (E, D), based on an amount of light reflected by an object located at p(E, D) and detected by detector D, and calculating a location of an object based on the reflection values and target positions.
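One plausible reading of the location calculation above is a reflection-weighted average of the target positions p(E, D); the Python sketch below illustrates that reading only, and is not asserted to be the claimed algorithm.

    # Minimal sketch, assuming the object location is estimated as the average
    # of the target positions p(E, D) weighted by the reflection values R(E, D).
    def estimate_location(reflections):
        """reflections: list of (R, (x, y)) pairs, where R is the reflection
        value for a co-activated emitter-detector pair and (x, y) is that
        pair's target position p(E, D)."""
        total = sum(r for r, _ in reflections)
        if total == 0:
            return None  # nothing detected
        x = sum(r * px for r, (px, py) in reflections) / total
        y = sum(r * py for r, (px, py) in reflections) / total
        return (x, y)

    if __name__ == "__main__":
        # Three emitter-detector pairs report reflections; target positions in mm.
        samples = [
            (0.10, (10.0, 20.0)),
            (0.80, (12.0, 21.0)),
            (0.30, (14.0, 22.0)),
        ]
        print(estimate_location(samples))  # pulled toward the strongest reflection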
Abstract:
A user interface for a vehicle, including a steering wheel for the vehicle, including a grip, a sensor operable to detect objects at a plurality of locations along the grip, and an illuminator operable to illuminate different portions of the grip, a processor in communication with the sensor, with the illuminator and with a controller of vehicle functions, and a non-transitory computer readable medium storing instructions which cause the processor to identify, via the sensor, a location of a first object along the grip, to illuminate, via the illuminator, a portion of the grip, adjacent to the identified location, to further identify, via the sensor, a second object being at the illuminated portion of the grip, and to activate, via the controller, a vehicle function in response to the second object being at the illuminated portion of the grip.
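The Python sketch below walks through the two-step interaction described above: a first object is detected on the grip, an adjacent portion is illuminated, and a vehicle function is activated only when a second object is detected at the illuminated portion. The sensor, illuminator and controller interfaces, and the named vehicle function, are illustrative assumptions.

    # Minimal sketch, assuming the grip is divided into numbered segments.
    class Illuminator:
        def light(self, segment):
            print(f"illuminating grip segment {segment}")
            self.lit = segment

    class Controller:
        def activate(self, function):
            print(f"vehicle function activated: {function}")

    def handle_first_touch(location, illuminator):
        """Illuminate the grip segment adjacent to the detected location."""
        segment = location + 1  # 'adjacent' taken here as the next segment
        illuminator.light(segment)
        return segment

    def handle_second_touch(location, lit_segment, controller):
        """Activate the function only if the second object is on the lit segment."""
        if location == lit_segment:
            controller.activate("adaptive_cruise_control")  # illustrative function

    if __name__ == "__main__":
        illuminator, controller = Illuminator(), Controller()
        lit = handle_first_touch(location=4, illuminator=illuminator)
        handle_second_touch(location=5, lit_segment=lit, controller=controller)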
Abstract:
A computer readable medium storing instructions which cause a processor to generate data structures for an object moving along the perimeter of a curved touch-sensitive user input device, each data structure corresponding to a gesture and including a time stamp, polar angles at which the object starts and ends, a middle polar angle of the object, and an assigned state being one of the group RECOGNIZED, UPDATED and ENDED, wherein the instructions cause the processor to assign the RECOGNIZED state to the data structure when the moving object is initially detected on the perimeter of the device, to assign the UPDATED state to the data structure when the moving object is further detected on the perimeter of the device after the initial detection, and to assign the ENDED state to the data structure when the moving object ceases to be detected on the perimeter of the device.
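The Python sketch below shows one way such a data structure and its three states could be represented. The state names follow the abstract; the field names, the update logic and the timing source are illustrative assumptions.

    # Minimal sketch, assuming angles in degrees and wall-clock time stamps.
    from dataclasses import dataclass
    from enum import Enum
    import time

    class State(Enum):
        RECOGNIZED = 1   # object first detected on the perimeter
        UPDATED    = 2   # object still detected after the initial detection
        ENDED      = 3   # object no longer detected

    @dataclass
    class PerimeterGesture:
        timestamp: float
        start_angle: float    # polar angle at which the object starts
        end_angle: float      # polar angle at which the object ends
        middle_angle: float   # middle polar angle of the object
        state: State

    def on_detection(gesture, start_angle, end_angle):
        """Creates the data structure on first detection, updates it afterwards."""
        middle = (start_angle + end_angle) / 2.0
        if gesture is None:
            return PerimeterGesture(time.time(), start_angle, end_angle,
                                    middle, State.RECOGNIZED)
        gesture.timestamp = time.time()
        gesture.end_angle = end_angle
        gesture.middle_angle = middle
        gesture.state = State.UPDATED
        return gesture

    def on_loss(gesture):
        """Marks the data structure ENDED when the object ceases to be detected."""
        gesture.state = State.ENDED
        return gesture

    if __name__ == "__main__":
        g = on_detection(None, start_angle=30.0, end_angle=40.0)  # RECOGNIZED
        g = on_detection(g, start_angle=30.0, end_angle=55.0)     # UPDATED
        g = on_loss(g)                                            # ENDED
        print(g)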