Abstract:
A handheld pointing device includes a main body, an image sensing module, an acceleration sensing module and a processing circuit. The image sensing module is disposed in the main body and configured to capture an image comprising at least one reference light source and accordingly generate an optical sensing signal. The acceleration sensing module is disposed in the main body and configured to sense an acceleration value in each of two dimensions; wherein the acceleration sensing module outputs an acceleration sensing signal if the absolute value of the sum of the two acceleration values falls within a predetermined acceleration range. The processing circuit is configured to receive the optical sensing signal and the acceleration sensing signal and accordingly generate an output signal.
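A minimal sketch of the acceleration-gating condition described above, in Python; the range limits, function names and signal representation are illustrative assumptions rather than details taken from the abstract.

# Sketch of the acceleration-gating step: the acceleration sensing signal is
# output only when |ax + ay| lies within the predetermined range.
ACC_MIN = 0.1   # hypothetical lower bound of the predetermined range (g)
ACC_MAX = 1.5   # hypothetical upper bound of the predetermined range (g)

def acceleration_sensing_signal(ax: float, ay: float):
    """Return the two-axis readings only when |ax + ay| lies in the range."""
    if ACC_MIN <= abs(ax + ay) <= ACC_MAX:
        return (ax, ay)          # acceleration sensing signal is output
    return None                  # otherwise no signal is output

def generate_output(optical_signal, ax: float, ay: float):
    """Combine the optical sensing signal with a gated acceleration signal."""
    acc_signal = acceleration_sensing_signal(ax, ay)
    if acc_signal is None:
        return None
    return {"optical": optical_signal, "acceleration": acc_signal}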
Abstract:
An interactive system includes a display, a processor and a remote controller. The display includes at least one reference beacon for providing light with a predetermined feature. The remote controller includes an image sensor configured to capture an image containing the reference beacon, and calculates an aiming coordinate according to an imaging position of the reference beacon in the captured image. The processor calculates a scale ratio of the pixel size of the display with respect to that of the image captured by the image sensor and moves a cursor position according to the scale ratio and the aiming coordinate.
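A minimal sketch of the scale-ratio cursor mapping, assuming example display and sensor resolutions and a simple proportional mapping; none of these values come from the abstract.

# Sketch of the scale-ratio cursor mapping.
DISPLAY_W, DISPLAY_H = 1920, 1080   # display resolution in pixels, assumed
SENSOR_W, SENSOR_H = 640, 480       # image-sensor resolution in pixels, assumed

# Scale ratio of the display pixel size with respect to the sensor pixel size.
SCALE_X = DISPLAY_W / SENSOR_W
SCALE_Y = DISPLAY_H / SENSOR_H

def cursor_position(beacon_x: float, beacon_y: float):
    """Map the beacon's imaging position (aiming coordinate) to a cursor position."""
    return (round(beacon_x * SCALE_X), round(beacon_y * SCALE_Y))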
Abstract:
There is provided an image positioning method including the steps of: capturing an image frame with an image sensor; identifying at least one object image in the image frame; comparing an object image size of the object image with a size threshold and identifying the object image having the object image size larger than the size threshold as a reference point image; and positioning the reference point image. There is further provided an interactive imaging system.
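A minimal sketch of the size-threshold step, assuming object images are given as lists of pixel coordinates and that positioning means taking the centroid; the threshold value is illustrative.

# Sketch of the size-threshold comparison: object images larger than the
# threshold are treated as reference point images and then positioned.
SIZE_THRESHOLD = 50  # minimum object-image area in pixels, assumed

def find_reference_points(object_images):
    """object_images: list of pixel-coordinate lists, one per detected object image."""
    reference_points = []
    for pixels in object_images:
        if len(pixels) > SIZE_THRESHOLD:              # compare size with threshold
            cx = sum(x for x, _ in pixels) / len(pixels)
            cy = sum(y for _, y in pixels) / len(pixels)
            reference_points.append((cx, cy))         # position the reference point image
    return reference_points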
Abstract:
An optical touch control method includes steps of: providing a bright background from at least one edge of a touch surface in a first period; providing illumination light to the touch surface in a second period; capturing a first image of an indicator object blocking a portion of the bright background in the first period; and capturing a second image of the indicator object reflecting the illumination light in the second period. An optical touch system is also provided.
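A minimal sketch of the two-period capture sequence; the backlight, illuminator and camera objects and their methods are hypothetical placeholders for whatever hardware interface the system provides.

# Sketch of the two-period capture: a shadow image against the bright background
# in the first period and a reflection image under illumination in the second.
def capture_touch_images(backlight, illuminator, camera):
    # First period: bright background on; the indicator object blocks part of it.
    backlight.on()
    illuminator.off()
    shadow_image = camera.capture()      # first image: blocked bright background

    # Second period: illumination light on; the indicator object reflects it.
    backlight.off()
    illuminator.on()
    reflect_image = camera.capture()     # second image: reflected illumination

    return shadow_image, reflect_image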
Abstract:
There is provided an interactive system, which includes a remote controller. The remote controller is equipped with a camera to capture an operating frame having a user image and a background image therein; and a processing unit to analyze the operating frame to identify a user image section and a background image section within the operating frame corresponding to the user image and the background image respectively, wherein the processing unit generates movement information of the remote controller according to intensity distributions of the user image section and the background image section.
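A simplified sketch of deriving movement information, assuming grayscale frames as 2D lists, a fixed intensity threshold separating the user and background sections, and using only the shift of the user-section centroid between frames; the abstract itself does not specify these details.

# Sketch of movement estimation from the intensity distribution of the sections.
INTENSITY_THRESHOLD = 128  # assumed split between user and background sections

def section_centroid(frame, keep_user: bool):
    """Centroid of pixels in the user (bright) or background (dark) section."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if (value >= INTENSITY_THRESHOLD) == keep_user:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def movement_information(prev_frame, curr_frame):
    """Movement of the remote controller estimated from the user-section shift."""
    prev_c = section_centroid(prev_frame, keep_user=True)
    curr_c = section_centroid(curr_frame, keep_user=True)
    if prev_c is None or curr_c is None:
        return (0.0, 0.0)
    return (curr_c[0] - prev_c[0], curr_c[1] - prev_c[1])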
Abstract:
There is provided a gesture detection device including two linear image sensor arrays and a processing unit. The processing unit is configured to compare sizes of pointer images in the image frames captured by the two linear image sensor arrays in the same period or different periods so as to identify a click event.
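A minimal sketch of the size-comparison idea, assuming a click event is declared when the pointer-image size changes by more than an illustrative ratio between two captures from the linear sensor arrays.

# Sketch of click detection by comparing pointer-image sizes.
CLICK_RATIO = 1.3  # assumed size-change ratio that counts as a click

def is_click(size_a: int, size_b: int) -> bool:
    """Compare pointer-image sizes from two captures; a large change flags a click."""
    if size_a == 0 or size_b == 0:
        return False
    ratio = max(size_a, size_b) / min(size_a, size_b)
    return ratio >= CLICK_RATIO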
Abstract:
An optical object recognition system includes at least two beacons, an image sensor and a processing unit. The beacons operate with an emission pattern, and the emission patterns of the beacons are phase shifted from each other. The image sensor captures image frames with a sampling period. The processing unit is configured to recognize different beacons according to the phase shift of the emission patterns in the image frames.
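A minimal sketch of phase-based beacon recognition, assuming simple on/off patterns spanning a few sampling periods, a brightness threshold and frame-aligned sampling; all of these are illustrative assumptions.

# Sketch of recognizing beacons by the phase of their emission pattern
# as observed over consecutive image frames.
BEACON_PATTERNS = {
    "beacon_1": [1, 0, 1, 0],   # assumed on/off pattern per sampling period
    "beacon_2": [0, 1, 0, 1],   # same pattern shifted by one sampling period
}

BRIGHT_THRESHOLD = 100  # assumed pixel intensity meaning "beacon is on"

def recognize_beacon(intensities):
    """intensities: brightness of one image spot over len(pattern) frames."""
    observed = [1 if v >= BRIGHT_THRESHOLD else 0 for v in intensities]
    for name, pattern in BEACON_PATTERNS.items():
        if observed == pattern:
            return name
    return None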