Abstract:
A method includes generating a light pattern using a display panel and forming a virtual image from the light pattern utilizing one or more optical components. The virtual image is viewable from a viewing location. The method also includes receiving external light from a real-world environment incident on an optical sensor. The real-world environment is viewable from the viewing location. Further, the method includes obtaining an image of the real-world environment from the received external light, identifying a background feature in the image of the real-world environment over which the virtual image is overlaid, and extracting one or more visual characteristics of the background feature. Additionally, the method includes comparing the one or more visual characteristics to an upper threshold value and a lower threshold value and controlling the generation of the light pattern based on the comparison.
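The comparison step described above can be sketched in code. The thresholds, step sizes, and the use of luminance as the extracted visual characteristic are illustrative assumptions, not details from the abstract:

```python
# Hypothetical sketch of comparing a background feature's visual
# characteristic against upper and lower thresholds and adjusting
# the generated light pattern. All constants are assumptions.

LOWER_THRESHOLD = 0.2   # background considered too dark below this
UPPER_THRESHOLD = 0.8   # background considered too bright above this

def control_light_pattern(background_luminance, current_brightness):
    """Return a new display brightness based on the background feature.

    If the background feature is brighter than the upper threshold,
    raise the display brightness so the overlaid virtual image remains
    visible; if darker than the lower threshold, lower it; otherwise
    leave it unchanged.
    """
    if background_luminance > UPPER_THRESHOLD:
        return min(1.0, current_brightness + 0.1)
    if background_luminance < LOWER_THRESHOLD:
        return max(0.0, current_brightness - 0.1)
    return current_brightness
```

Using both an upper and a lower threshold gives a dead band between them, so the display setting is not adjusted on every small fluctuation in the background.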
Abstract:
A wearable computing device includes a head-mounted display (HMD) that provides a field of view in which at least a portion of the environment of the wearable computing device is viewable. The HMD is operable to display images superimposed over the field of view. When the wearable computing device determines that a target device is within its environment, the wearable computing device obtains target device information related to the target device. The target device information may include information that defines a virtual control interface for controlling the target device and an identification of a defined area of the target device on which the virtual control interface is to be provided. The wearable computing device controls the HMD to display the virtual control interface as an image superimposed over the defined area of the target device in the field of view.
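The target-device flow described above can be illustrated with a simple lookup. The registry contents, field names, and coordinate convention below are assumptions for illustration only:

```python
# Hypothetical sketch: obtaining target device information and the
# defined overlay area for a detected device. The registry and its
# fields are illustrative assumptions, not from the abstract.

TARGET_DEVICE_REGISTRY = {
    "thermostat-01": {
        "control_interface": ["temp_up", "temp_down", "mode"],
        "overlay_area": (120, 80, 200, 140),  # (x0, y0, x1, y1) in view coords
    },
}

def get_virtual_control(device_id):
    """Return (controls, overlay_area) for a detected target device.

    Returns None if no target device information is known, in which
    case nothing is superimposed over the field of view.
    """
    info = TARGET_DEVICE_REGISTRY.get(device_id)
    if info is None:
        return None
    return info["control_interface"], info["overlay_area"]
```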
Abstract:
Example methods and systems for manipulating and displaying a real-time image and/or photograph on a wearable computing system are disclosed. A wearable computing system may provide a view of a real-world environment of the wearable computing system. The wearable computing system may image at least a portion of the view of the real-world environment in real-time to obtain a real-time image. The wearable computing system may receive at least one input command that is associated with a desired manipulation of the real-time image. The at least one input command may be a hand gesture. Then, based on the at least one received input command, the wearable computing system may manipulate the real-time image in accordance with the desired manipulation. After manipulating the real-time image, the wearable computing system may display the manipulated real-time image in a display of the wearable computing system.
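The command-to-manipulation step can be sketched as a dispatch from recognized gestures to image operations. The gesture names and the specific manipulations (zoom, rotate) are illustrative assumptions; real pixel data stands in here as a list of rows:

```python
# Minimal sketch of mapping hand-gesture input commands to desired
# manipulations of a real-time image. Gesture names and operations
# are assumptions for illustration.

def manipulate(image, command):
    """Apply the manipulation associated with an input command.

    `image` is a list of rows of pixel values (a stand-in for a
    real image buffer).
    """
    if command == "zoom":      # e.g. a pinch-out hand gesture
        # Double each pixel horizontally (a crude 2x horizontal zoom).
        return [[p for p in row for _ in (0, 1)] for row in image]
    if command == "rotate":    # e.g. a circular hand gesture
        # Rotate the image 90 degrees clockwise.
        return [list(row) for row in zip(*image[::-1])]
    return image               # unrecognized commands leave it unchanged
```

The manipulated result would then be written back to the HMD display, per the last step of the abstract.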
Abstract:
An imaging device includes a first pixel array arranged to capture a first image and a second pixel array arranged to capture a second image. The first pixel array and the second pixel array face substantially a same direction. The imaging device also includes shutter control circuitry which is coupled to the first pixel array to initiate a first exposure period of the first pixel array to capture the first image. The shutter control circuitry is also coupled to the second pixel array to initiate a second exposure period of the second pixel array to capture the second image. The imaging device also includes processing logic coupled to receive first pixel data of the first image and coupled to receive second pixel data of the second image. The processing logic is configured to generate at least one image using the first pixel data and the second pixel data.
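One way such processing logic might combine the two captures is a per-pixel merge. The simple average below is an illustrative assumption, not the patented method, and the flat pixel lists stand in for the two arrays' data:

```python
# Hedged sketch: generating one image from the pixel data of two
# exposures captured by co-facing pixel arrays. The per-pixel
# average is an assumption chosen for simplicity.

def combine_exposures(first_pixels, second_pixels):
    """Average corresponding pixels from the two captures."""
    return [(a + b) // 2 for a, b in zip(first_pixels, second_pixels)]
```

With differing exposure periods (one short, one long), a merge along these lines is a common route to extended dynamic range.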
Abstract:
An eyepiece for a head mounted display includes an illumination module, an end reflector, a viewing region, and a polarization rotator. The illumination module provides CGI light along a forward propagation path within the eyepiece. The end reflector is disposed at an opposite end of the eyepiece from the illumination module to reflect the CGI light back along a reverse propagation path within the eyepiece. The viewing region is disposed between the illumination module and the end reflector and includes an out-coupling polarizing beam splitter ("PBS"). The out-coupling PBS passes the CGI light traveling along the forward propagation path and redirects the CGI light traveling along the reverse propagation path out of an eye-ward side of the eyepiece. The polarization rotator is disposed in the forward and reverse propagation paths between the out-coupling PBS and the end reflector.
Abstract:
An optical system includes a display panel, an image former, a viewing window, a proximal beam splitter, and a distal beam splitter. The display panel is configured to generate a light pattern. The image former is configured to form a virtual image from the light pattern generated by the display panel. The viewing window is configured to allow outside light in from outside of the optical system. The virtual image and the outside light are viewable along a viewing axis extending through the proximal beam splitter. The distal beam splitter is optically coupled to the display panel and the proximal beam splitter and has a beam-splitting interface in a plane that is parallel to the viewing axis. A camera may also be optically coupled to the distal beam splitter so as to be able to receive a portion of the outside light that is viewable along the viewing axis.
Abstract:
A technique for adaptive brightness control of an eyepiece of a head mounted display ("HMD") includes displaying a computer generated image ("CGI") to an eye of a user from a viewing region of the eyepiece of the HMD. Image data is captured from an ambient environment surrounding the HMD. A brightness value is calculated for the ambient environment based at least in part upon the image data. A bias power setting is determined based at least in part upon the brightness value. The bias power setting is applied to an illumination source for generating the CGI and a brightness level of the CGI displayed to the eye of the user is controlled with the bias power setting.
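The brightness-to-bias-power pipeline described above can be sketched directly. The use of a mean pixel value as the brightness metric and the linear mapping with its constants are illustrative assumptions:

```python
# Sketch of the adaptive brightness control technique described above.
# The brightness metric and the linear mapping are assumptions.

def ambient_brightness(image_data):
    """Calculate a brightness value as the mean pixel value (0..255)."""
    return sum(image_data) / len(image_data)

def bias_power_setting(brightness, min_power=0.1, max_power=1.0):
    """Determine a bias power setting from the ambient brightness.

    Brighter surroundings map to a higher bias power so the CGI
    remains visible; dimmer surroundings map toward the minimum.
    """
    fraction = min(max(brightness / 255.0, 0.0), 1.0)
    return min_power + fraction * (max_power - min_power)
```

Applying the returned setting to the illumination source then controls the brightness level of the displayed CGI, closing the loop described in the abstract.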
Abstract:
An optical apparatus includes an image source, a scanning mirror, an actuator, and a scanning controller. The image source outputs an image by simultaneously projecting a two-dimensional array of image pixels representing a whole portion of the image. The scanning mirror is positioned in an optical path of the image to reflect the image. The actuator is coupled to the scanning mirror to selectively adjust the scanning mirror about at least one axis. The scanning controller is coupled to the actuator to control a position of the scanning mirror about the at least one axis. The scanning controller includes logic to continuously and repetitiously adjust the position of the scanning mirror to cause the image to be scanned over an eyebox area that is larger than the whole portion of the image.
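The continuous, repetitious mirror adjustment can be illustrated by generating a repeating sequence of angular positions about one axis. The step count and angular range are assumptions for illustration:

```python
# Illustrative sketch: a repeating sequence of scanning-mirror angles
# about one axis, so the projected whole image sweeps out an eyebox
# larger than the image itself. Constants are assumptions.

def eyebox_scan_positions(steps=4, max_angle_deg=2.0):
    """Generate evenly spaced mirror angles from -max to +max degrees.

    Cycling through these positions continuously causes the reflected
    image to be scanned over an enlarged eyebox area.
    """
    step = 2 * max_angle_deg / (steps - 1)
    return [-max_angle_deg + i * step for i in range(steps)]
```

A scanning controller would loop over this sequence repeatedly, commanding the actuator to each angle in turn.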
Abstract:
An eyepiece for a head mounted display includes an illumination module, an end reflector, a viewing region, and a polarization rotator. The illumination module includes an image source for launching computer generated image ("CGI") light along a forward propagation path. The end reflector is disposed at an opposite end of the eyepiece from the illumination module to reflect the CGI light back along a reverse propagation path. The viewing region is disposed between the illumination module and the end reflector. The viewing region includes a polarizing beam splitter ("PBS") and a non-polarizing beam splitter ("non-PBS") disposed between the PBS and the end reflector. The viewing region redirects the CGI light from the reverse propagation path out of an eye-ward side of the eyepiece. The polarization rotator is disposed in the forward and reverse propagation paths of the CGI light between the viewing region and the end reflector.