Abstract:
Various embodiments of the present technology may comprise methods and apparatus for data compression and transmission. Embodiments of the present technology transmit relevant data sub-cubes and compress and transmit non-relevant data sub-cubes. Relevant data sub-cubes may be those sub-cubes that contain detected target data and sub-cubes that are directly adjacent to the detected target data. Data contained in the directly adjacent sub-cubes that is overlapping/shared is transmitted only once.
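The relevance-based transmission scheme described above can be sketched in a few lines. This is a hedged illustration, not the patented implementation: the function and variable names, the 3-D grid representation, and the use of `zlib` for the non-relevant sub-cubes are all illustrative assumptions.

```python
# Illustrative sketch: sub-cubes containing detected targets, plus their
# directly adjacent neighbours, are marked relevant (sent uncompressed);
# all remaining sub-cubes are compressed before transmission. Each
# sub-cube appears in exactly one set, so shared/overlapping neighbours
# of multiple targets are transmitted only once.
import itertools
import zlib

def neighbours(idx):
    """Return the 26 sub-cube indices directly adjacent to idx in a 3-D grid."""
    x, y, z = idx
    return {
        (x + dx, y + dy, z + dz)
        for dx, dy, dz in itertools.product((-1, 0, 1), repeat=3)
        if (dx, dy, dz) != (0, 0, 0)
    }

def plan_transmission(target_indices, all_indices):
    """Split sub-cubes into a raw (relevant) set and a compressed set."""
    relevant = set(target_indices)
    for t in target_indices:
        relevant |= neighbours(t) & set(all_indices)
    return relevant, set(all_indices) - relevant

# Example: a 4x4x4 grid of sub-cubes with one detected target.
grid = list(itertools.product(range(4), repeat=3))
raw, compressed = plan_transmission({(1, 1, 1)}, grid)
assert len(raw) == 27            # the 3x3x3 relevant block, with no duplicates
payload = zlib.compress(b"\x00" * 512)   # stand-in for one non-relevant sub-cube
```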
Abstract:
Various embodiments of the present technology comprise a method and apparatus for high dynamic range imaging. According to various embodiments, the method and apparatus for high dynamic range imaging are configured to select the “best” signal for each pixel location to construct an HDR output. In one embodiment, the “best” signal is selected from among various pixel signals based on the values of the pixel signals and identification of non-saturated signals. If only one pixel value for a particular pixel location is non-saturated, then the “best” signal is the non-saturated signal. If more than one pixel value is non-saturated, then the “best” signal is the pixel signal with the greatest value. If two or more pixel values are non-saturated and have equally great values, then the “best” signal is the pixel signal with the shortest integration time.
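The selection rule above can be sketched directly. This is a hedged illustration under stated assumptions: each candidate signal is modelled as a `(value, integration_time, saturated)` tuple, and the function name `select_best` is invented here, not taken from the patent.

```python
# Sketch of the "best"-signal selection rule for one pixel location.
def select_best(candidates):
    """Pick the "best" signal from (value, integration_time, saturated) tuples."""
    unsaturated = [c for c in candidates if not c[2]]
    if len(unsaturated) == 1:
        return unsaturated[0]          # only one usable (non-saturated) signal
    # Several non-saturated signals: take the greatest value, breaking
    # ties in favour of the shortest integration time.
    return max(unsaturated, key=lambda c: (c[0], -c[1]))

# Long exposure saturated; the mid exposure has the greatest usable value.
best = select_best([(1023, 16.0, True), (900, 4.0, False), (230, 1.0, False)])
assert best == (900, 4.0, False)
# Equal non-saturated values: the shorter integration time wins.
assert select_best([(500, 4.0, False), (500, 1.0, False)]) == (500, 1.0, False)
```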
Abstract:
An image sensor may have an array of image sensor pixels arranged in color filter unit cells each having one red image pixel that generates red image signals, one blue image pixel that generates blue image signals, and two clear image sensor pixels that generate white image signals. The image sensor may be coupled to processing circuitry that performs filtering operations on the red, blue, and white image signals to increase noise correlations in the image signals, thereby reducing noise amplification when applying a color correction matrix to the image signals. The processing circuitry may extract a green image signal from the white image signal. The processing circuitry may compute a scaling value that includes a linear combination of the red, blue, white, and green image signals. The scaling value may be applied to the red, blue, and green image signals to produce corrected image signals having improved image quality.
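The scaling-value computation described above can be sketched as follows. This is a minimal illustration, not the patented algorithm: the green extraction `G = W - R - B` and the linear-combination coefficients are assumptions made here for demonstration, since the abstract does not give exact values.

```python
# Hedged sketch: extract green from the white signal, form a scaling value
# as a linear combination of R, B, W, and G, then apply it to R, G, and B.
# Coefficients and the green-extraction formula are illustrative.
def correct_rbw(r, b, w, coeffs=(0.2, 0.2, 0.4, 0.2)):
    g = w - r - b                                   # green extracted from white
    a_r, a_b, a_w, a_g = coeffs
    scale = a_r * r + a_b * b + a_w * w + a_g * g   # linear-combination scaling value
    # Apply the scaling value to the red, green, and blue image signals.
    return r * scale, g * scale, b * scale

r_c, g_c, b_c = correct_rbw(0.2, 0.1, 0.9)          # normalized pixel signals
assert abs(g_c - 0.324) < 1e-9                      # g = 0.6, scale = 0.54
```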
Abstract:
An image sensor may have an array of pixels that include nested sub-pixels that each have at least one respective photodiode. An inner sub-pixel of a pixel with nested sub-pixels may have a relatively lower effective light collecting area compared to an outer sub-pixel of the pixel within which the inner sub-pixel is nested. A pixel circuit for the nested sub-pixels may include an overflow capacitor and/or a coupled gate circuit used to route charges from the photodiode in the inner sub-pixel. The lower light collecting area of the photodiode in the inner sub-pixel, with optional flicker mitigation charge routing from the coupled gates structure, may reduce the size of the capacitors required to capture photodiode and photodiode overflow charge responses. Flicker mitigation charge routing using a coupled gates structure may allow an adjustable proportion of the overflow charge to be stored in one or more storage capacitors.
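The adjustable overflow routing described above can be modelled numerically. This is a hedged sketch of the behavior, not a circuit description: the function name, the full-well value, and the idea that the non-stored fraction goes to a second destination (e.g. another capacitor or a drain) are illustrative assumptions.

```python
# Sketch of coupled-gate overflow routing: charge beyond the photodiode's
# full-well capacity overflows, and an adjustable proportion of that
# overflow is steered into a storage capacitor.
def route_charge(total_charge, full_well, stored_fraction=0.5):
    """Split charge into (photodiode charge, stored overflow, other overflow)."""
    overflow = max(0.0, total_charge - full_well)
    kept = total_charge - overflow                  # charge held by the photodiode
    return kept, overflow * stored_fraction, overflow * (1.0 - stored_fraction)

assert route_charge(800.0, 1000.0) == (800.0, 0.0, 0.0)        # no overflow
assert route_charge(1600.0, 1000.0, 0.25) == (1000.0, 150.0, 450.0)
```

Because the inner sub-pixel collects less light, and only a proportion of its overflow needs storing, the storage capacitors can be correspondingly smaller, as the abstract notes.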
Abstract:
An image sensor may include an array of image photodiodes formed in rows and columns. The array of image photodiodes may include a region of photodiodes arranged in three adjacent rows and three adjacent columns of the array. The region of photodiodes may include four non-adjacent photodiodes, each of which generates charge in response to the same color of light. The four non-adjacent photodiodes may be coupled to a shared floating diffusion node. Each of the four non-adjacent photodiodes may transfer generated charge to the shared floating diffusion node. The charges from each of the four non-adjacent photodiodes may be summed at the shared floating diffusion node and read out as a summed signal or may be individually transferred to the shared floating diffusion node and read out individually.
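The two readout modes described above can be sketched with a toy 3x3 region whose four corner photodiodes are the non-adjacent, same-color diodes sharing one floating diffusion node. The representation and function name are illustrative assumptions.

```python
# Sketch: four same-colour photodiodes at the corners of a 3x3 region share
# one floating diffusion node. They can be read out either as one summed
# (binned) signal or individually, as the abstract describes.
def readout(region, summed=True):
    """region is a 3x3 grid of charges; corner diodes share a floating diffusion."""
    corners = [region[r][c] for r in (0, 2) for c in (0, 2)]
    return sum(corners) if summed else corners

region = [[10, 0, 12],
          [0,  0,  0],
          [14, 0, 16]]
assert readout(region) == 52                          # summed at the shared node
assert readout(region, summed=False) == [10, 12, 14, 16]   # individual transfers
```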
Abstract:
A pixel may include an inner sub-pixel group and an outer sub-pixel group. The inner sub-pixel group may have a smaller light collecting area than the outer sub-pixel group and therefore be less sensitive to light than the outer sub-pixel group. This may enable the pixel to be used to generate high dynamic range images, even with the sub-pixel groups using the same length integration time. The inner sub-pixel group may be nested within the outer sub-pixel group. Additionally, one or both of the inner sub-pixel group and the outer sub-pixel group can be split into at least two sub-pixels so that the sub-pixel group can be used to gather phase detection data. Adjacent pixels may have sub-pixel groups split in different directions to enable detection of vertical and horizontal edges in a scene.
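The single-integration-time HDR idea above can be sketched numerically. This is a hedged illustration: the sensitivity ratio, full-well value, and the simple "use the outer signal unless saturated" combination rule are assumptions made here, not values from the patent.

```python
# Sketch: the inner sub-pixel group's smaller light-collecting area makes it
# sensitivity_ratio times less sensitive, so it stays unsaturated in bright
# regions even with the same integration time as the outer group.
def hdr_combine(outer, inner, sensitivity_ratio=8.0, full_well=1000.0):
    if outer < full_well:                 # outer group not saturated: use it
        return outer
    return inner * sensitivity_ratio     # outer saturated: scale up inner signal

assert hdr_combine(500.0, 62.5) == 500.0      # normal scene: outer signal
assert hdr_combine(1000.0, 250.0) == 2000.0   # bright scene: extended range
```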
Abstract:
An image sensor may include pixels having nested sub-pixels. A pixel with nested sub-pixels may include an inner sub-pixel that has either an elliptical or a rectangular light collecting area. The inner sub-pixel may be formed in a substrate and may be immediately surrounded by an outer sub-pixel group that includes one or more sub-pixels. The light collecting area of the inner sub-pixel at the substrate surface may be less sensitive to light than the light collecting areas of the one or more outer sub-pixel groups. Microlenses may be formed over the nested sub-pixels to direct light away from the inner sub-pixel and toward the outer sub-pixel groups. A color filter of a single color may be formed over the nested sub-pixels. Hybrid color filters having a single color filter region over the inner sub-pixel and a portion of the one or more outer sub-pixel groups may also be used.
Abstract:
An image sensor may include a pixel array with high dynamic range functionality and phase detection pixels. The phase detection pixels may be arranged in phase detection pixel groups. Each phase detection pixel group may include three adjacent pixels arranged consecutively in a line. A single microlens may cover all three pixels in the phase detection pixel group, or two microlenses may combine to cover the three pixels in the phase detection pixel group. The edge pixels in each phase detection pixel group may have the same integration time and the same color. The middle pixel in each phase detection pixel group may have the same or different color as the edge pixels, and the same or different integration time as the edge pixels. Phase detection pixel groups may also be formed from two pixels that each are 1.5 times the size of neighboring pixels.
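The phase detection role of the edge pixels above can be sketched simply. This is a hedged illustration of the principle only: under the shared microlens, the two edge pixels (same color, same integration time) see the scene through opposite sides of the lens, so their signal difference indicates defocus. The function name and sign convention are assumptions.

```python
# Sketch: the two edge pixels of a three-pixel phase detection group act as
# left/right phase pixels; their difference is ~0 when the scene is in focus
# and grows (with a sign giving the defocus direction) when it is not.
def phase_difference(group):
    left, middle, right = group    # three adjacent pixels in a line
    return left - right

assert phase_difference((120, 200, 120)) == 0   # in focus
assert phase_difference((150, 200, 90)) > 0     # defocused in one direction
```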
Abstract:
An image sensor may include an array of image sensor pixels. Each image sensor pixel may have signal storage capabilities implemented through a write-back supply line and a control transistor for the supply line. Each image sensor pixel may output pixel values over column lines to switching circuitry. The switching circuitry may route the pixel values to signal processing circuitry. The signal processing circuitry may perform analog and/or digital processing operations on the pixel values, utilizing analog circuits or pinned diode devices, to output processed pixel values. The processing circuitry may send the processed pixel values back to the array, allowing the array to act as memory circuitry for the processing circuitry. Configured this way, signal processing can be performed in close proximity to the array without having to move pixel signals to peripheral processing circuitry.