Abstract:
Apparatuses, methods, and systems are presented for reacting to scene-based occurrences. Such an apparatus may comprise dedicated computer vision (CV) computation hardware configured to receive sensor data from a sensor array comprising a plurality of sensor pixels and capable of computing one or more CV features using readings from neighboring sensor pixels of the sensor array. The apparatus may further comprise a first processing unit configured to control operation of the dedicated CV computation hardware. The first processing unit may be further configured to execute one or more application programs and, in conjunction with execution of the one or more application programs, communicate with at least one input/output (I/O) device controller, to effectuate an I/O operation in reaction to an event generated based on operations performed on the one or more computed CV features.
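A minimal sketch of the event flow described in this abstract, under assumptions of my own: the class names (DedicatedCVHardware, IODeviceController), the method names, and the face-detection example are illustrative placeholders, not details taken from the source.

    # Hypothetical sketch: a single processing unit configures dedicated CV
    # hardware, waits for an event derived from computed CV features, and
    # effectuates an I/O operation through an I/O device controller.

    class DedicatedCVHardware:
        """Stand-in for hardware that computes CV features from neighboring
        sensor pixels and flags reference occurrences as events."""
        def configure(self, detect="face"):
            self.detect = detect

        def wait_for_event(self):
            # Real hardware would raise an interrupt; here we fake one event.
            return {"type": "face_detected", "confidence": 0.93}


    class IODeviceController:
        """Stand-in for an I/O device controller (e.g., a display controller)."""
        def effectuate(self, operation):
            print(f"I/O operation: {operation}")


    def application_loop():
        cv_hw = DedicatedCVHardware()
        io_ctrl = IODeviceController()
        cv_hw.configure(detect="face")
        event = cv_hw.wait_for_event()          # event generated from CV features
        if event["type"] == "face_detected":
            io_ctrl.effectuate("wake_display")  # react to the scene-based occurrence


    application_loop()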
Abstract:
Methods, systems, computer-readable media, and apparatuses for obtaining vital measurements are presented. The vital measurements may include a blood pressure value that can be obtained by determining a pulse-transit time (PTT) as a function of a photoplethysmography (PPG) measurement and an electrocardiogram (ECG) measurement. A mobile device includes an outer body sized to be portable for a user, a processor contained within the outer body, a display coupled to a light guide, and at least one first sensor coupled to the light guide. The display is configured to display an illumination pattern directing light toward blood vessels within the user. The at least one first sensor is configured to measure light from the illumination pattern that is reflected off of the blood vessels within the user, wherein the processor is configured to obtain a first measurement indicative of changes in blood volume based at least in part on the measured reflected light.
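A rough illustration of how PTT could be turned into a blood pressure estimate: the abstract only states that blood pressure is obtained from a PTT derived from ECG and PPG measurements, so the peak-based timing, the inverse-PTT calibration model, and all constants below are common but assumed choices applied to synthetic signals.

    # Illustrative only: compute PTT as the delay between an ECG R-peak and the
    # subsequent PPG pulse arrival, then map PTT to a blood pressure estimate
    # with a hypothetical per-user calibration (constants a, b are invented).
    import numpy as np

    fs = 500.0                      # sampling rate in Hz (assumed)
    t = np.arange(0, 2.0, 1 / fs)   # two seconds of synthetic signals

    # Synthetic ECG R-peak at 0.40 s and PPG pulse peak at 0.62 s.
    ecg = np.exp(-((t - 0.40) ** 2) / (2 * 0.004 ** 2))
    ppg = np.exp(-((t - 0.62) ** 2) / (2 * 0.030 ** 2))

    r_peak_time = t[np.argmax(ecg)]      # ECG R-peak (proximal timing point)
    ppg_peak_time = t[np.argmax(ppg)]    # PPG pulse arrival (distal timing point)
    ptt = ppg_peak_time - r_peak_time    # pulse-transit time in seconds

    a, b = 25.0, 10.0                    # hypothetical calibration constants
    systolic_estimate = a / ptt + b      # blood pressure falls as PTT rises

    print(f"PTT = {ptt * 1000:.0f} ms, estimated systolic BP = {systolic_estimate:.0f} mmHg")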
Abstract:
(Front-page drawing, FIG. 13B: single-processor vision sensor system comprising a visual sensor array unit receiving visual input, peripheral circuitry, a dedicated processor with a CV hardware controller core and an application processor core, memory, and an I/O device controller, with a face-detection event passed between them.)
Title: SINGLE-PROCESSOR COMPUTER VISION HARDWARE CONTROL AND APPLICATION EXECUTION (International Publication No. WO 2018/136325 A1, published 26 July 2018)
Apparatuses, methods, and systems are presented for reacting to scene-based occurrences. Such an apparatus may comprise dedicated computer vision (CV) computation hardware configured to receive sensor data from a sensor array comprising a plurality of sensor pixels and capable of computing one or more CV features using readings from neighboring sensor pixels of the sensor array. The apparatus may further comprise a first processing unit configured to control operation of the dedicated CV computation hardware. The first processing unit may be further configured to execute one or more application programs and, in conjunction with execution of the one or more application programs, communicate with at least one input/output (I/O) device controller, to effectuate an I/O operation in reaction to an event generated based on operations performed on the one or more computed CV features.
Abstract:
Techniques disclosed herein utilize a vision sensor that integrates a special-purpose camera with dedicated computer vision (CV) computation hardware and a dedicated low-power microprocessor for the purposes of detecting, tracking, recognizing, and/or analyzing subjects, objects, and scenes in the view of the camera. The vision sensor processes the information retrieved from the camera using the included low-power microprocessor and sends "events" (or indications that one or more reference occurrences have occurred and, possibly, associated data) to the main processor only when needed or as defined and configured by the application. This allows the general-purpose microprocessor (which is typically relatively high-speed and high-power to support a variety of applications) to stay in a low-power mode (e.g., sleep mode) most of the time, as is conventional, while becoming active only when events are received from the vision sensor.
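A hedged sketch of the power-saving pattern described above, with a blocking queue standing in for the hardware interrupt that would wake the main processor; the thread structure, names, and timing are illustrative assumptions rather than details from the source.

    # Hypothetical sketch: the main application processor stays blocked ("asleep")
    # and only runs when the vision sensor's low-power microprocessor forwards an
    # event it has been configured to report.
    import queue
    import threading
    import time

    events = queue.Queue()

    def vision_sensor():
        """Low-power microprocessor: analyzes camera data and forwards rare events."""
        time.sleep(0.5)                         # continuous scene analysis
        events.put({"type": "face_detected"})   # forwarded only when configured to

    def main_processor():
        """High-power processor: sleeps until an event arrives, then reacts."""
        event = events.get()                    # blocking wait stands in for sleep mode
        print(f"woke up to handle event: {event['type']}")

    threading.Thread(target=vision_sensor).start()
    main_processor()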
Abstract:
Apparatuses, methods, and systems are presented for reacting to scene-based occurrences. Such an apparatus may comprise dedicated computer vision (CV) computation hardware configured to receive sensor data from a sensor array comprising a plurality of sensor pixels and capable of computing one or more CV features using readings from neighboring sensor pixels of the sensor array. The apparatus may further comprise a first processing unit configured to control operation of the dedicated CV computation hardware. The first processing unit may be further configured to execute one or more application programs and, in conjunction with execution of the one or more application programs, communicate with at least one input/output (I/O) device controller, to effectuate an I/O operation in reaction to an event generated based on operations performed on the one or more computed CV features.
Abstract:
Broadband white color can be achieved in MEMS display devices by incorporating a material having an extinction coefficient (k) below a threshold value for wavelengths of light within an operative optical range of the interferometric modulator. One embodiment provides a method of making the MEMS display device comprising depositing said material (23) over at least a portion of a transparent substrate (20), depositing a dielectric layer (24) over the layer of material, forming a sacrificial layer over the dielectric layer, depositing an electrically conductive layer (14) on the sacrificial layer, and forming a cavity (19) by removing at least a portion of the sacrificial layer. The suitable material may comprise germanium, germanium alloys of various compositions, doped germanium, or doped germanium-containing alloys, and may be deposited over the transparent substrate or incorporated within the transparent substrate or the dielectric layer.
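The material-selection criterion here (extinction coefficient below a threshold across the operative optical range) can be illustrated with a simple check; the wavelength grid, k values, and threshold below are invented for illustration and are not data from the source.

    # Illustrative only: verify that a candidate material's extinction
    # coefficient k stays below a chosen threshold over the operative range.
    import numpy as np

    wavelengths_nm = np.linspace(400, 700, 7)   # assumed operative optical range (visible)
    k_values = np.array([0.9, 0.7, 0.5, 0.35, 0.25, 0.2, 0.15])  # hypothetical k(lambda)
    k_threshold = 1.0                           # hypothetical threshold

    suitable = bool(np.all(k_values < k_threshold))
    print(f"material suitable for broadband white: {suitable}")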