Interactive Motion-Based Eye Tracking Calibration

    Publication number: US20220214747A1

    Publication date: 2022-07-07

    Application number: US17707013

    Filing date: 2022-03-29

    Applicant: APPLE INC.

    Abstract: The invention concerns a method for performing a calibration procedure for calibrating an eye tracking device (12), wherein a stimulus object (S) is displayed within a certain display area (22) such that the stimulus object (S) is at least temporarily moving along a defined trajectory (26), and images of at least one eye (16) of at least one user (18) are captured while the stimulus object (S) is displayed. Gaze data are provided based on the captured images, and gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined in dependence on the gaze data. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; Θ; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined in dependence on a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S). Further, at least one control parameter (N, (A, D), T) is determined in dependence on at least part of the captured gaze data, and the execution of at least part of the calibration procedure is controlled in dependence on the at least one determined control parameter (N, (A, D), T).
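    The abstract describes fitting calibration-model parameters from observed gaze points relative to a known stimulus trajectory, and deriving a control parameter from the gaze data to steer the rest of the procedure. As a rough, hypothetical sketch only (not the claimed method; the affine model choice, function names, and the residual-based control rule are assumptions for illustration), such a fit could look like a least-squares regression of raw gaze estimates onto the trajectory points, with the fit residual serving as a control parameter:

    ```python
    import numpy as np

    def fit_calibration(raw_gaze, stimulus_path):
        """Fit an affine correction mapping raw gaze estimates to the
        known stimulus trajectory (illustrative sketch only)."""
        # Design matrix [x, y, 1] for a six-parameter affine model,
        # loosely in the spirit of the abstract's parameters a1..a6.
        X = np.hstack([raw_gaze, np.ones((len(raw_gaze), 1))])
        # Least squares: one column of coefficients per output axis.
        coeffs, *_ = np.linalg.lstsq(X, stimulus_path, rcond=None)
        corrected = X @ coeffs
        # A simple control parameter: the RMS residual between corrected
        # gaze points and the trajectory, used to decide whether the
        # calibration needs more samples or another pass.
        rms = float(np.sqrt(np.mean(np.sum((stimulus_path - corrected) ** 2, axis=1))))
        return coeffs, rms

    # Synthetic example: gaze follows the trajectory under a fixed scale
    # and offset distortion, which the affine fit can recover exactly.
    t = np.linspace(0.0, 1.0, 50)
    path = np.column_stack([t, t ** 2])          # defined trajectory (26)
    raw = 0.9 * path + np.array([0.05, -0.03])   # distorted gaze points (P)
    model, rms = fit_calibration(raw, path)
    needs_more_samples = rms > 0.01              # control decision
    ```

    In this toy setup the distortion is itself affine and noise-free, so the residual collapses to essentially zero; with real gaze data the residual would gate how the calibration procedure continues.
    
    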

    Interactive motion-based eye tracking calibration

    Publication number: US10976813B2

    Publication date: 2021-04-13

    Application number: US16308804

    Filing date: 2017-06-12

    Applicant: APPLE INC.

    Abstract: The invention concerns a method for performing a calibration procedure for calibrating an eye tracking device (12), wherein a stimulus object (S) is displayed within a certain display area (22) such that the stimulus object (S) is at least temporarily moving along a defined trajectory (26), and images of at least one eye (16) of at least one user (18) are captured while the stimulus object (S) is displayed. Gaze data are provided based on the captured images, and gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined in dependence on the gaze data. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; Θ; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined in dependence on a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S). Further, at least one control parameter (N, (A, D), T) is determined in dependence on at least part of the captured gaze data, and the execution of at least part of the calibration procedure is controlled in dependence on the at least one determined control parameter (N, (A, D), T).

    Interactive Motion-Based Eye Tracking Calibration

    Publication number: US20240069631A1

    Publication date: 2024-02-29

    Application number: US18144079

    Filing date: 2023-05-05

    Applicant: APPLE INC.

    CPC classification number: G06F3/013 G06V40/19 G06V40/193 G06F2203/011

    Abstract: The invention concerns a method for performing a calibration procedure for calibrating an eye tracking device (12), wherein a stimulus object (S) is displayed within a certain display area (22) such that the stimulus object (S) is at least temporarily moving along a defined trajectory (26), and images of at least one eye (16) of at least one user (18) are captured while the stimulus object (S) is displayed. Gaze data are provided based on the captured images, and gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined in dependence on the gaze data. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; Θ; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined in dependence on a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S). Further, at least one control parameter (N, (A, D), T) is determined in dependence on at least part of the captured gaze data, and the execution of at least part of the calibration procedure is controlled in dependence on the at least one determined control parameter (N, (A, D), T).

    Interactive motion-based eye tracking calibration

    Publication number: US11644896B2

    Publication date: 2023-05-09

    Application number: US17707013

    Filing date: 2022-03-29

    Applicant: APPLE INC.

    CPC classification number: G06F3/013 G06V40/19 G06V40/193 G06F2203/011

    Abstract: The invention concerns a method for performing a calibration procedure for calibrating an eye tracking device (12), wherein a stimulus object (S) is displayed within a certain display area (22) such that the stimulus object (S) is at least temporarily moving along a defined trajectory (26), and images of at least one eye (16) of at least one user (18) are captured while the stimulus object (S) is displayed. Gaze data are provided based on the captured images, and gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined in dependence on the gaze data. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; Θ; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined in dependence on a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S). Further, at least one control parameter (N, (A, D), T) is determined in dependence on at least part of the captured gaze data, and the execution of at least part of the calibration procedure is controlled in dependence on the at least one determined control parameter (N, (A, D), T).

    Interactive motion-based eye tracking calibration

    Publication number: US12026309B2

    Publication date: 2024-07-02

    Application number: US18144079

    Filing date: 2023-05-05

    Applicant: APPLE INC.

    CPC classification number: G06F3/013 G06V40/19 G06V40/193 G06F2203/011

    Abstract: The invention concerns a method for performing a calibration procedure for calibrating an eye tracking device (12), wherein a stimulus object (S) is displayed within a certain display area (22) such that the stimulus object (S) is at least temporarily moving along a defined trajectory (26), and images of at least one eye (16) of at least one user (18) are captured while the stimulus object (S) is displayed. Gaze data are provided based on the captured images, and gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined in dependence on the gaze data. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; Θ; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined in dependence on a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S). Further, at least one control parameter (N, (A, D), T) is determined in dependence on at least part of the captured gaze data, and the execution of at least part of the calibration procedure is controlled in dependence on the at least one determined control parameter (N, (A, D), T).

    Interactive motion-based eye tracking calibration

    Publication number: US11320904B2

    Publication date: 2022-05-03

    Application number: US17196403

    Filing date: 2021-03-09

    Applicant: APPLE INC.

    Abstract: The invention concerns a method for performing a calibration procedure for calibrating an eye tracking device (12), wherein a stimulus object (S) is displayed within a certain display area (22) such that the stimulus object (S) is at least temporarily moving along a defined trajectory (26), and images of at least one eye (16) of at least one user (18) are captured while the stimulus object (S) is displayed. Gaze data are provided based on the captured images, and gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined in dependence on the gaze data. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; Θ; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined in dependence on a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S). Further, at least one control parameter (N, (A, D), T) is determined in dependence on at least part of the captured gaze data, and the execution of at least part of the calibration procedure is controlled in dependence on the at least one determined control parameter (N, (A, D), T).

    Interactive Motion-Based Eye Tracking Calibration

    Publication number: US20210223862A1

    Publication date: 2021-07-22

    Application number: US17196403

    Filing date: 2021-03-09

    Applicant: APPLE INC.

    Abstract: The invention concerns a method for performing a calibration procedure for calibrating an eye tracking device (12), wherein a stimulus object (S) is displayed within a certain display area (22) such that the stimulus object (S) is at least temporarily moving along a defined trajectory (26), and images of at least one eye (16) of at least one user (18) are captured while the stimulus object (S) is displayed. Gaze data are provided based on the captured images, and gaze points (P) of the at least one eye (16) of the user (18) with respect to the display area (22) are determined in dependence on the gaze data. Further, at least one calibration parameter (a1; a2; a3; a4; a5; a6; a7; a8; a9; a10; a11; a12; a13; a14; Θ; R; K; a; b; r) of at least one predefined calibration model (M, M1, M2, M3, M4, M5, M6) is determined in dependence on a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory (26) of the stimulus object (S). Further, at least one control parameter (N, (A, D), T) is determined in dependence on at least part of the captured gaze data, and the execution of at least part of the calibration procedure is controlled in dependence on the at least one determined control parameter (N, (A, D), T).
