Abstract:
An image processing device comprising: an image acquisition device; a parallax information acquisition device; and a calculation device configured to calculate a first pixel and a second pixel for each pixel of the acquired image by using a first digital filter and a second digital filter corresponding to the parallax information for that pixel, the first digital filter and the second digital filter being selected from a first digital filter group and a second digital filter group, respectively, the first digital filter group and the second digital filter group being digital filter groups for giving a parallax to the acquired image and having left-right symmetry to each other, and each of the first digital filter group and the second digital filter group having filter sizes that differ depending on a magnitude of the parallax to be given, wherein the left-right symmetry of the first and second digital filter groups differs between a central part and an edge part of the image.
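The filtering scheme above can be sketched minimally in Python for a single image row. Everything concrete here is an illustrative assumption, not the patented filters: the ramp-shaped kernels, the function names (`kernel_pair`, `make_parallax_pair`), and the rule that filter size grows linearly with the parallax magnitude.

```python
def kernel_pair(parallax):
    """Return a (left, right) pair of mirrored 1-D kernels.

    Kernel size grows with |parallax| (assumption: size = 1 + 2*|parallax|);
    the two kernels are left-right symmetric to each other.
    """
    size = 1 + 2 * abs(parallax)
    ramp = list(range(size, 0, -1))        # e.g. [3, 2, 1] for size 3
    total = sum(ramp)
    left = [w / total for w in ramp]       # weight concentrated to the left
    right = left[::-1]                     # mirror image: right-weighted
    return left, right


def filter_pixel(row, x, kernel):
    """Apply a centered 1-D kernel at position x, clamping at the borders."""
    half = len(kernel) // 2
    acc = 0.0
    for k, w in enumerate(kernel):
        xx = min(max(x + k - half, 0), len(row) - 1)
        acc += w * row[xx]
    return acc


def make_parallax_pair(row, parallax_map):
    """Produce first-pixel and second-pixel rows from one acquired row."""
    left_img, right_img = [], []
    for x, p in enumerate(parallax_map):
        kl, kr = kernel_pair(p)
        left_img.append(filter_pixel(row, x, kl))
        right_img.append(filter_pixel(row, x, kr))
    return left_img, right_img
```

With zero parallax the kernels reduce to the identity, so the two output images coincide; nonzero parallax blurs each output asymmetrically in mirrored directions, which is the sense in which the two filter groups give a parallax.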
Abstract:
A single-plate color imaging element in which the color filter array includes a basic array pattern with first filters corresponding to a first color and second filters corresponding to a second color, the second color having contribution ratios for obtaining luminance signals lower than those of the first color. The basic array pattern is repeatedly arranged in a diagonal grid shape; one or more first filters are arranged in the horizontal, vertical, upper-right, and lower-right directions of the color filter array; one or more second filters corresponding to each color of the second color are arranged in the upper-right and lower-right directions of the color filter array within the basic array pattern; and a proportion of the number of pixels of the first color corresponding to the first filters is greater than a proportion of the number of pixels of each color of the second color corresponding to the second filters.
Abstract:
An image pickup apparatus includes: an imaging element that includes a first pixel group and a second pixel group that respectively photoelectrically convert luminous fluxes passing through different areas of a single imaging optical system, generates a first image formed by output values of the first pixel group and a second image formed by output values of the second pixel group, and incorporates a pixel addition unit that adds a pixel value of the first pixel group and a pixel value of the second pixel group to generate a two-dimensional image; and an image processing section that matches a shading shape of the first image and a shading shape of the second image with a shading shape of the two-dimensional image by applying correction coefficients stored in a memory to the output values of the first pixel group and the second pixel group.
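One plausible reading of the correction step can be sketched as follows, under the assumption that each stored coefficient is the ratio of the two-dimensional image's shading shape to the corresponding pixel group's shading shape at that position; the function names and the 1-D layout are illustrative only.

```python
def correction_coefficients(shading_2d, shading_group):
    """Per-position coefficients that map a pixel group's shading shape
    onto the two-dimensional image's shading shape (assumed ratio model)."""
    return [s2 / sg for s2, sg in zip(shading_2d, shading_group)]


def apply_shading_correction(pixels, coeffs):
    """Multiply each output value by its stored correction coefficient."""
    return [p * c for p, c in zip(pixels, coeffs)]
```

After correction, a flat scene captured through the pixel group's shading exhibits the same falloff profile as the two-dimensional image, which is the "matching of shading shapes" the abstract describes.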
Abstract:
A color filter array includes a basic array pattern P1 constituted by a square array pattern corresponding to 3×3 pixels. In the color filter array, the basic array pattern P1 is repeatedly arranged in the horizontal direction and the vertical direction. G filters, which correspond to brightness-system pixels, are arranged at the four corners and the center, that is, on both diagonal lines. G filters are placed in each line of the horizontal, vertical, and diagonal directions of the color filter array, and the color filter array includes a square array corresponding to 2×2 pixels constituted by G filters. The ratio of the number of G pixels, which contribute most to obtaining a brightness signal, in the basic array pattern P1 is greater than each of the ratios of the number of R pixels and the number of B pixels, which correspond to the colors other than G.
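The described pattern can be illustrated with one concrete 3×3 arrangement that satisfies the stated properties; the exact placement of R and B within the non-G positions is an assumption here, and the patented pattern may place them differently.

```python
# Hypothetical basic array pattern P1: G at the four corners and the
# center (both diagonals), with assumed R/B placement elsewhere.
P1 = [
    ["G", "R", "G"],
    ["B", "G", "B"],
    ["G", "R", "G"],
]


def tile(pattern, reps):
    """Repeat the basic array pattern horizontally and vertically."""
    rows = [row * reps for row in pattern]
    return rows * reps


cfa = tile(P1, 2)   # a 6x6 section of the color filter array
```

Tiling P1 makes the corner G filters of adjacent patterns meet, which is where the 2×2 square of G pixels appears, and every horizontal and vertical line of the tiled array contains at least one G filter.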
Abstract:
Two kinds of color images having different color tones, obtained by a solid-state imaging device, are combined with each other. A first captured color image obtained by a first pixel group of the solid-state imaging device (pixels whose color filters have wide spectral sensitivities) is processed, and a second captured color image obtained by a second pixel group (pixels whose color filters have narrow spectral sensitivities) is processed. The level difference between the captured image signals of the pixels of the first pixel group and those of the second pixel group, caused by the spectral sensitivity difference between the wide and narrow color filters, is obtained (steps S1 and S2). The level difference is corrected, and the first captured color image and the second captured color image are combined with each other.
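A minimal sketch of the level-difference correction, assuming the difference is modeled as a single gain estimated from average signal levels and that combination is a simple blend; the real correction may well be per-color or per-position, and the weighting scheme here is invented for illustration.

```python
def level_gain(wide_vals, narrow_vals):
    """Estimate the level difference (steps S1/S2 analogue) as the ratio
    of average signal levels of the two pixel groups."""
    return (sum(wide_vals) / len(wide_vals)) / (sum(narrow_vals) / len(narrow_vals))


def correct_and_combine(wide_vals, narrow_vals, weight=0.5):
    """Correct the narrow-filter signals by the estimated gain, then
    blend the two images (weight is an illustrative parameter)."""
    g = level_gain(wide_vals, narrow_vals)
    corrected = [v * g for v in narrow_vals]
    return [weight * w + (1 - weight) * c for w, c in zip(wide_vals, corrected)]
```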
Abstract:
An imaging element includes a memory that is incorporated in the imaging element and stores first image data obtained by imaging performed by the imaging element, and a first processor that is incorporated in the imaging element and is configured to perform image data processing on the first image data. The first processor is configured to receive vibration information related to a vibration exerted on the imaging element within a frame output period defined by a first frame rate, and to output second image data obtained by assigning the vibration information to a specific position set in the first image data within the frame output period.
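As an illustration, assigning vibration information to a specific position in the frame data could look like writing a gyro sample into a reserved byte offset of the frame buffer before output; the byte offset, the three-float little-endian layout, and the function names are all assumptions, not details from the abstract.

```python
import struct


def embed_vibration(frame_bytes, vib_xyz, offset):
    """Write a 3-axis vibration sample (three float32 values) into a
    reserved region of the frame buffer at the given byte offset."""
    struct.pack_into("<3f", frame_bytes, offset, *vib_xyz)
    return frame_bytes


def read_vibration(frame_bytes, offset):
    """Recover the embedded vibration sample from the second image data."""
    return struct.unpack_from("<3f", frame_bytes, offset)
```

Embedding the sample inside the frame itself keeps the vibration measurement synchronized with the frame output period it belongs to, without a separate side channel.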
Abstract:
An imaging apparatus includes a plurality of imaging elements, at least one signal processing circuit, and a transfer path. Each of the plurality of imaging elements includes a memory that is incorporated in the imaging element and stores image data obtained by imaging a subject, and a communication interface that is incorporated in the imaging element and outputs output image data based on the image data stored in the memory. The transfer path connects the plurality of imaging elements and a single signal processing circuit in series, and the communication interface of each of the plurality of imaging elements outputs the output image data through the transfer path to an imaging element in a rear stage or to the signal processing circuit.
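The serial (daisy-chain) transfer can be sketched as each imaging element appending its own frame to the stream received from the preceding stage, so that a single stream reaches the signal processing circuit at the end of the chain; the aggregation scheme and the class names are illustrative assumptions.

```python
class ImagingElement:
    """Toy model of one imaging element in the series transfer path."""

    def __init__(self, name):
        self.name = name
        self.memory = None              # incorporated memory for image data

    def capture(self, data):
        self.memory = data              # store image data obtained by imaging

    def output(self, upstream):
        # Communication interface: forward upstream frames plus own frame
        # to the next stage (a rear-stage element or the processing circuit).
        return upstream + [self.memory]


def transfer_chain(elements):
    """Propagate output image data element-by-element along the path."""
    stream = []
    for e in elements:
        stream = e.output(stream)
    return stream                       # arrives at the signal processing circuit
```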
Abstract:
The imaging element has a first and a second phase difference pixel region, each including a plurality of phase difference pixels, and an imaging pixel region between the first and the second phase difference pixel regions in a first direction. The processor is configured to cause the imaging element to perform imaging at a frame cycle, execute first readout processing of reading out a signal from the first phase difference pixel region during a first frame period, and execute second readout processing of reading out a signal from the second phase difference pixel region during a second frame period subsequent to the first frame period. A first exposure time, during which the first phase difference pixel region is exposed, and a second exposure time, during which the second phase difference pixel region is exposed, are each different from an exposure time of the imaging pixel region.
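The alternating readout can be illustrated as a per-frame plan; the concrete exposure values and the even/odd frame assignment below are purely illustrative, not taken from the abstract.

```python
# Illustrative exposure settings (seconds); the phase difference regions
# use exposure times distinct from the imaging pixel region's.
IMAGING_EXPOSURE = 1 / 60
PD_EXPOSURE = {"first_pd": 1 / 120, "second_pd": 1 / 240}


def frame_plan(n):
    """Return the readout plan for frame n: the first phase difference
    region in one frame period, the second in the subsequent one."""
    pd_region = "first_pd" if n % 2 == 0 else "second_pd"
    return {
        "frame": n,
        "pd_region": pd_region,
        "pd_exposure": PD_EXPOSURE[pd_region],
        "imaging_exposure": IMAGING_EXPOSURE,
    }
```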
Abstract:
An imaging element incorporates a reading portion that reads out captured image data at a first frame rate, a storage portion that stores the image data, a processing portion that processes the image data, and an output portion that outputs the processed image data at a second frame rate lower than the first frame rate. The reading portion reads out the image data of each of a plurality of frames in parallel. The storage portion stores, in parallel, each image data read out in parallel by the reading portion. The processing portion performs generation processing of generating output image data of one frame using the image data of each of the plurality of frames stored in the storage portion.
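A sketch of the generation processing, assuming simple per-pixel averaging as the way one output frame is produced from several frames read in parallel, and a 4:1 ratio between the first and second frame rates; both choices are illustrative, not stated in the abstract.

```python
# Illustrative frame rates: reading runs at the first (higher) frame rate,
# output at the second (lower) frame rate.
READ_FPS = 240
FRAMES_PER_OUTPUT = 4
OUTPUT_FPS = READ_FPS // FRAMES_PER_OUTPUT


def combine_frames(frames):
    """Generate output image data of one frame from the image data of a
    plurality of frames (assumed combination: per-pixel average)."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]
```

Averaging is only one candidate for the generation processing; the point of the sketch is the rate relationship, where N frames read in parallel at the first frame rate yield one frame output at the second, lower frame rate.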
Abstract:
An imaging apparatus includes a storage portion that stores image data obtained by imaging, and a processing portion that processes the image data, in which the processing portion performs processing of reducing the image data, performs first detection processing of detecting a specific subject image showing a specific subject from a reduced image indicated by reduced image data obtained by reducing the image data, and in a case where the specific subject image is not detected by the first detection processing, performs second detection processing on pre-reduction image data that is the image data before reduction stored in the storage portion.
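The two-stage detection flow can be sketched as follows, with nearest-neighbor subsampling standing in for the reduction processing and a caller-supplied detector function; both are illustrative stand-ins for whatever the apparatus actually uses.

```python
def reduce_image(img, scale):
    """Reduction processing (assumed: nearest-neighbor subsampling)."""
    return [row[::scale] for row in img[::scale]]


def detect(image, detector, scale=4):
    """First detection on the reduced image; if the specific subject
    image is not found there, second detection on the stored
    pre-reduction image data."""
    reduced = reduce_image(image, scale)
    hit = detector(reduced)          # first detection processing
    if hit is not None:
        return hit
    return detector(image)           # second detection processing
```

The fallback matters when the subject is small enough to vanish under subsampling: the cheap reduced-image pass handles the common case, and the full-resolution pass recovers subjects the reduction discarded.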