-
Publication No.: US12147631B2
Publication Date: 2024-11-19
Application No.: US18449826
Filing Date: 2023-08-15
Applicant: Neonode Inc.
Inventor: Björn Thomas Eriksson , Sven Robert Pettersson , Stefan Johannes Holmgren , Xiatao Wang , Rozita Teymourzadeh , Per Erik Lindström , Emil Anders Braide , Jonas Daniel Justus Hjelm , Erik Anders Claes Rosengren
Abstract: A method for interacting with controls in a graphical user interface (GUI), including recording user interface gestures performed by a user, for each recorded gesture: when the gesture includes the user virtually touching a specific GUI control, applying the gesture to the specific GUI control; and when the gesture is performed without the user virtually touching a specific GUI control, identifying a particular GUI control that the user is gazing at and applying the gesture to that particular GUI control.
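A minimal sketch of the dispatch rule this abstract describes, assuming hypothetical GuiControl and Gesture types and a gaze_target callback standing in for an eye-tracker; none of these names come from the patent:

```python
# Sketch of the gesture-dispatch rule described in the abstract.
# All names (GuiControl, Gesture, gaze_target) are illustrative, not from the patent.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class GuiControl:
    name: str

    def apply(self, gesture: "Gesture") -> None:
        print(f"{gesture.kind} applied to {self.name}")


@dataclass
class Gesture:
    kind: str                       # e.g. "tap", "swipe"
    touched: Optional[GuiControl]   # control virtually touched by the gesture, if any


def dispatch(gestures: List[Gesture], gaze_target) -> None:
    """Route each recorded gesture to a GUI control.

    If the gesture virtually touches a specific control, it goes to that control;
    otherwise it goes to the control the user is gazing at.
    """
    for gesture in gestures:
        target = gesture.touched if gesture.touched is not None else gaze_target()
        if target is not None:
            target.apply(gesture)


if __name__ == "__main__":
    volume = GuiControl("volume slider")
    play = GuiControl("play button")
    # Gaze-tracker stub: the user is looking at the play button.
    dispatch([Gesture("tap", volume), Gesture("swipe", None)], lambda: play)
```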
-
Publication No.: US10928957B2
Publication Date: 2021-02-23
Application No.: US16775057
Filing Date: 2020-01-28
Applicant: Neonode Inc.
Inventor: Björn Thomas Eriksson , Sven Robert Pettersson , Stefan Johannes Holmgren , Xiatao Wang , Rozita Teymourzadeh , Per Erik Lindström , Emil Anders Braide , Jonas Daniel Justus Hjelm , Erik Anders Claes Rosengren
Abstract: A sensor including lenses, light emitters, each emitter projecting light out of a lens in a particular emission direction along a detection plane, light detectors, each detector detecting maximum light intensity when light enters a lens at a particular detection angle, a table of hotspots, each hotspot corresponding to an emitter-detector pair, the hotspot being a two-dimensional location in the detection plane along the emission direction of the emitter of the pair where projected light reflected by an object placed at that location enters the lens for the detector of the pair at the detection angle of the detector, and a processor receiving outputs from the detectors corresponding to detected amounts of projected light reflected by an object in the detection plane, and calculating a two-dimensional location of the object in the detection plane based on the received outputs and based on hotspots for synchronously activated emitter-detector pairs.
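A hedged sketch of the localization step, assuming the object position is estimated as an intensity-weighted centroid of the hotspot locations of the synchronously activated emitter-detector pairs; the abstract does not commit to this particular formula, and all names here are illustrative:

```python
# Hedged sketch: estimate an object's (x, y) from a hotspot table and detector outputs
# using an intensity-weighted centroid. The exact formula is an assumption.
from typing import Dict, Tuple

# Hotspot table: (emitter_id, detector_id) -> (x, y) location in the detection plane
# where light from that emitter, reflected by an object, peaks at that detector.
HotspotTable = Dict[Tuple[int, int], Tuple[float, float]]


def locate_object(hotspots: HotspotTable,
                  outputs: Dict[Tuple[int, int], float]) -> Tuple[float, float]:
    """Estimate the object's (x, y) from the outputs of activated emitter-detector pairs."""
    total = sum(outputs.values())
    if total == 0:
        raise ValueError("no reflected light detected")
    x = sum(w * hotspots[pair][0] for pair, w in outputs.items()) / total
    y = sum(w * hotspots[pair][1] for pair, w in outputs.items()) / total
    return x, y


if __name__ == "__main__":
    table: HotspotTable = {(0, 0): (10.0, 5.0), (0, 1): (12.0, 5.0), (1, 1): (14.0, 5.0)}
    readings = {(0, 0): 0.2, (0, 1): 0.9, (1, 1): 0.3}   # detected intensities
    print(locate_object(table, readings))                 # approx (12.1, 5.0)
```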
-
Publication No.: US20200167034A1
Publication Date: 2020-05-28
Application No.: US16775057
Filing Date: 2020-01-28
Applicant: Neonode Inc.
Inventor: Björn Thomas Eriksson , Sven Robert Pettersson , Stefan Johannes Holmgren , Xiatao Wang , Rozita Teymourzadeh , Per Erik Lindström , Emil Anders Braide , Jonas Daniel Justus Hjelm , Erik Anders Claes Rosengren
Abstract: A sensor including lenses, light emitters, each emitter projecting light out of a lens in a particular emission direction along a detection plane, light detectors, each detector detecting maximum light intensity when light enters a lens at a particular detection angle, a table of hotspots, each hotspot corresponding to an emitter-detector pair, the hotspot being a two-dimensional location in the detection plane along the emission direction of the emitter of the pair where projected light reflected by an object placed at that location enters the lens for the detector of the pair at the detection angle of the detector, and a processor receiving outputs from the detectors corresponding to detected amounts of projected light reflected by an object in the detection plane, and calculating a two-dimensional location of the object in the detection plane based on the received outputs and based on hotspots for synchronously activated emitter-detector pairs.
-
Publication No.: US11782557B2
Publication Date: 2023-10-10
Application No.: US17176033
Filing Date: 2021-02-15
Applicant: Neonode Inc.
Inventor: Björn Thomas Eriksson , Sven Robert Pettersson , Stefan Johannes Holmgren , Xiatao Wang , Rozita Teymourzadeh , Per Erik Lindström , Emil Anders Braide , Jonas Daniel Justus Hjelm , Erik Anders Claes Rosengren
CPC classification number: G06F3/0421 , G06F3/013 , G06F3/017 , G06F3/041 , G06F3/0412 , G09G5/00 , G06F2203/04103 , G06F2203/04104
Abstract: A method for interacting with controls in a graphical user interface (GUI) having a plurality of GUI controls, the method including identifying one GUI control that a user is gazing at, from among a plurality of GUI controls presented on a display, detecting user interface gestures performed by the user, when the detected gesture is performed in an airspace away from the display, applying a relative motion corresponding to the gesture to the identified GUI control, and when the detected gesture is performed by the user touching the display, applying a sequence of absolute display locations touched by the gesture to a GUI control presented at the touched locations that is different than the identified GUI control.
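An illustrative sketch of the two routing paths described above: relative motion from an in-air gesture is applied to the gazed-at control, while absolute touch coordinates go to whichever control lies under the touch. The Control class and handler names are assumptions for demonstration:

```python
# Sketch of the air-gesture vs. touch-gesture routing. Names and geometry are illustrative.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Control:
    name: str
    x: float
    y: float
    width: float
    height: float
    value: float = 0.0

    def contains(self, px: float, py: float) -> bool:
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)


def handle_air_gesture(gazed: Control, dx: float) -> None:
    """An in-air gesture applies relative motion to the gazed-at control."""
    gazed.value += dx


def handle_touch_gesture(controls: List[Control],
                         points: List[Tuple[float, float]]) -> None:
    """A touch gesture applies absolute display locations to the control under each point."""
    for px, py in points:
        for control in controls:
            if control.contains(px, py):
                control.value = px - control.x   # e.g. drive a slider from the touch x
                break


if __name__ == "__main__":
    volume = Control("volume", x=0, y=0, width=100, height=20)
    brightness = Control("brightness", x=0, y=30, width=100, height=20)
    handle_air_gesture(volume, dx=5.0)                           # gaze at volume, swipe in the air
    handle_touch_gesture([volume, brightness], [(40.0, 35.0)])   # touch lands on brightness
    print(volume.value, brightness.value)                        # 5.0 40.0
```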
-
Publication No.: US10585530B2
Publication Date: 2020-03-10
Application No.: US15616106
Filing Date: 2017-06-07
Applicant: Neonode Inc.
Inventor: Björn Thomas Eriksson , Sven Robert Pettersson , Stefan Johannes Holmgren , Xiatao Wang , Rozita Teymourzadeh , Per Erik Lindström , Emil Anders Braide , Jonas Daniel Justus Hjelm , Erik Anders Claes Rosengren
Abstract: A method for identifying a proximal object, including providing light emitters, light detectors and lenses, mounted in a housing, each lens, denoted L, being positioned in relation to a respective one of the detectors, denoted D, such that light entering lens L is maximally detected at detector D when the light enters lens L at an angle of incidence θ, activating the detectors synchronously with activation of each emitter to measure reflections of the light beams emitted by each emitter, and calculating a location of a reflective object along a path of a light beam projected by an activated emitter, by calculating an axis of symmetry with respect to which the outputs of the synchronously activated detectors are approximately symmetric, orienting the calculated axis of symmetry by the angle θ, and locating a point of intersection of the path of the emitted light beam with the oriented axis of symmetry.
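A hedged geometric sketch of the calculation in this abstract: estimate the point about which the detector outputs are approximately symmetric, orient a line through it by the angle θ, and intersect that line with the emitted beam's path. The detector layout, the use of an intensity-weighted centroid as the symmetry estimate, and all identifiers are assumptions:

```python
# Hedged sketch: symmetry center of detector outputs, oriented by theta,
# intersected with the emitter's beam path. Geometry and estimator are assumptions.
import math
from typing import List, Tuple

Point = Tuple[float, float]


def symmetry_center(detector_xs: List[float], outputs: List[float]) -> float:
    """x position about which the detector outputs are approximately symmetric."""
    total = sum(outputs)
    return sum(x * w for x, w in zip(detector_xs, outputs)) / total


def intersect(p0: Point, d0: Point, p1: Point, d1: Point) -> Point:
    """Intersection of two lines, each given as a point and a direction vector."""
    det = d0[0] * d1[1] - d0[1] * d1[0]
    if abs(det) < 1e-12:
        raise ValueError("lines are parallel")
    t = ((p1[0] - p0[0]) * d1[1] - (p1[1] - p0[1]) * d1[0]) / det
    return p0[0] + t * d0[0], p0[1] + t * d0[1]


if __name__ == "__main__":
    detector_xs = [0.0, 1.0, 2.0, 3.0, 4.0]         # detector lens positions along the housing
    outputs = [0.1, 0.5, 0.9, 0.5, 0.1]             # reflections, symmetric about x = 2.0
    cx = symmetry_center(detector_xs, outputs)
    theta = math.radians(60)                         # peak-detection angle of the lenses
    axis_dir = (math.cos(theta), math.sin(theta))    # symmetry axis oriented by theta
    beam_origin, beam_dir = (3.0, 0.0), (0.0, 1.0)   # emitter beam projected straight out
    print(intersect((cx, 0.0), axis_dir, beam_origin, beam_dir))  # approx (3.0, 1.73)
```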
-
Publication No.: US20230384896A1
Publication Date: 2023-11-30
Application No.: US18449826
Filing Date: 2023-08-15
Applicant: Neonode Inc.
Inventor: Björn Thomas Eriksson , Sven Robert Pettersson , Stefan Johannes Holmgren , Xiatao Wang , Rozita Teymourzadeh , Per Erik Lindström , Emil Anders Braide , Jonas Daniel Justus Hjelm , Erik Anders Claes Rosengren
CPC classification number: G06F3/0421 , G06F3/041 , G06F3/0412 , G06F3/017 , G06F3/013 , G09G5/00 , G06F2203/04103 , G06F2203/04104
Abstract: A method for interacting with controls in a graphical user interface (GUI), including recording user interface gestures performed by a user, for each recorded gesture: when the gesture includes the user virtually touching a specific GUI control, applying the gesture to the specific GUI control; and when the gesture is performed without the user virtually touching a specific GUI control, identifying a particular GUI control that the user is gazing at and applying the gesture to that particular GUI control.
-
Publication No.: US20210181891A1
Publication Date: 2021-06-17
Application No.: US17176033
Filing Date: 2021-02-15
Applicant: Neonode Inc.
Inventor: Björn Thomas Eriksson , Sven Robert Pettersson , Stefan Johannes Holmgren , Xiatao Wang , Rozita Teymourzadeh , Per Erik Lindström , Emil Anders Braide , Jonas Daniel Justus Hjelm , Erik Anders Claes Rosengren
Abstract: A method for interacting with controls in a graphical user interface (GUI), including recording user interface gestures performed by a user, for each recorded gesture: when the gesture includes the user virtually touching a specific GUI control, applying the gesture to the specific GUI control; and when the gesture is performed without the user virtually touching a specific GUI control, identifying a particular GUI control that the user is gazing at and applying the gesture to that particular GUI control.
-
Publication No.: US20170269787A1
Publication Date: 2017-09-21
Application No.: US15616106
Filing Date: 2017-06-07
Applicant: Neonode Inc.
Inventor: Thomas Eriksson , Robert Pettersson , Stefan Holmgren , Xiatao Wang , Rozita Teymourzadeh , Per Erik Lindström , Emil Anders Braide , Jonas Daniel Justus Hjelm , Erik Rosengren
Abstract: A user interface system, including a display surface displaying thereon a plurality of controls representing different applications, a multi-faceted housing situated along a single edge of the display surface, including an eye-tracker mounted in a first facet of the housing, the first facet being distal from the display surface, the eye-tracker identifying a control on the display surface at which a user's eyes are directed, and a proximity sensor mounted in a second facet of the housing, the second facet being between the first facet and the display, the proximity sensor detecting a gesture performed by an object opposite the second facet, and a processor running the different applications, connected to the proximity sensor and to the eye-tracker, causing the application represented by the control identified by the eye-tracker to receive as input the gesture detected by the proximity sensor.
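An illustrative sketch of the routing the processor performs in this system: the gesture detected by the proximity sensor is delivered as input to the application represented by the control the eye-tracker identifies. The callable-based wiring and names are assumptions for demonstration only:

```python
# Sketch of delivering a proximity-sensor gesture to the application behind the
# gazed-at control. Sensors are stubbed as callables; all names are illustrative.
from typing import Callable, Dict

# Each application is represented here simply as a gesture handler.
Application = Callable[[str], None]


def route_gesture(eye_tracker: Callable[[], str],
                  proximity_sensor: Callable[[], str],
                  apps_by_control: Dict[str, Application]) -> None:
    """Deliver the sensed gesture as input to the application of the gazed-at control."""
    control = eye_tracker()        # control the user's eyes are directed at
    gesture = proximity_sensor()   # gesture performed opposite the sensor facet
    app = apps_by_control.get(control)
    if app is not None:
        app(gesture)


if __name__ == "__main__":
    apps: Dict[str, Application] = {
        "music": lambda g: print(f"music app received '{g}'"),
        "maps": lambda g: print(f"maps app received '{g}'"),
    }
    route_gesture(lambda: "music", lambda: "swipe-left", apps)  # music app received 'swipe-left'
```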