-
1.
Publication No.: US20250046069A1
Publication Date: 2025-02-06
Application No.: US18715333
Application Date: 2022-11-30
Applicant: THE REGENTS OF THE UNIVERSITY OF CALIFORNIA
Inventor: Aydogan Ozcan , Yair Rivenson , Bijie Bai , Hongda Wang
IPC: G06V10/82 , G06V10/143 , G06V20/69
Abstract: A deep learning-based virtual HER2 IHC staining method uses a conditional generative adversarial network that is trained to rapidly transform autofluorescence microscopic images of unlabeled/label-free breast tissue sections into bright-field equivalent microscopic images, matching the standard HER2 IHC staining that is chemically performed on the same tissue sections. The efficacy of this staining framework was demonstrated by quantitative analysis of blindly graded HER2 scores of virtually stained and immunohistochemically stained HER2 whole slide images (WSIs). A second quantitative blinded study revealed that the virtually stained HER2 images exhibit comparable staining quality in terms of nuclear detail, membrane clearness, and absence of staining artifacts with respect to their immunohistochemically stained counterparts. This virtual staining framework bypasses the costly, laborious, and time-consuming IHC staining procedures in the laboratory, and can be extended to other types of biomarkers to accelerate IHC tissue staining and the biomedical workflow.
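Illustrative sketch (not part of the patent record): a minimal PyTorch conditional-GAN training step of the kind described above, where a generator maps autofluorescence inputs to brightfield-equivalent IHC-like images and a discriminator judges input/output pairs. The toy networks, channel counts, and loss weights are assumptions for illustration only, not the patented architecture.
# Minimal conditional-GAN training step for virtual staining (placeholder modules).
import torch
import torch.nn as nn

generator = nn.Sequential(nn.Conv2d(4, 64, 3, padding=1), nn.ReLU(),
                          nn.Conv2d(64, 3, 3, padding=1))          # toy stand-in for a U-Net generator
discriminator = nn.Sequential(nn.Conv2d(4 + 3, 64, 3, padding=1), nn.ReLU(),
                              nn.Conv2d(64, 1, 3, padding=1))      # toy patch discriminator on (input, output) pairs
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-4)
bce, l1 = nn.BCEWithLogitsLoss(), nn.L1Loss()

autofluor = torch.rand(2, 4, 256, 256)   # unlabeled tissue autofluorescence (4 channels, assumed)
ihc_target = torch.rand(2, 3, 256, 256)  # co-registered, chemically stained HER2 IHC ground truth

# Discriminator step: real pairs -> 1, generated pairs -> 0.
fake = generator(autofluor)
d_real = discriminator(torch.cat([autofluor, ihc_target], dim=1))
d_fake = discriminator(torch.cat([autofluor, fake.detach()], dim=1))
d_loss = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
d_opt.zero_grad(); d_loss.backward(); d_opt.step()

# Generator step: fool the discriminator while staying close to the target stain.
d_fake = discriminator(torch.cat([autofluor, fake], dim=1))
g_loss = bce(d_fake, torch.ones_like(d_fake)) + 100.0 * l1(fake, ihc_target)
g_opt.zero_grad(); g_loss.backward(); g_opt.step()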
-
2.
Publication No.: US12190478B2
Publication Date: 2025-01-07
Application No.: US17530471
Application Date: 2021-11-19
Applicant: THE REGENTS OF THE UNIVERSITY OF CALIFORNIA
Inventor: Aydogan Ozcan , Yair Rivenson , Hongda Wang , Harun Gunaydin , Kevin de Haan
IPC: G06T5/50 , G06N3/08 , G06T3/4046 , G06T3/4053 , G06T3/4076 , G06T5/70 , G06T5/73 , G06T5/92 , G06T7/00
Abstract: A microscopy method includes a trained deep neural network that is executed by software using one or more processors of a computing device, the network having been trained with a set of images comprising co-registered pairs of high-resolution microscopy images or image patches of a sample and their corresponding low-resolution microscopy images or image patches of the same sample. A microscopy input image of a sample to be imaged is input to the trained deep neural network, which rapidly outputs an image of the sample with improved spatial resolution, depth of field, signal-to-noise ratio, and/or image contrast.
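Illustrative sketch (not part of the patent record): a minimal supervised training loop on co-registered low-resolution/high-resolution patch pairs, the training setup the abstract describes. The tiny network, patch sizes, and L1 loss are assumptions; the patented network and data pipeline differ.
# Supervised training on co-registered low-res / high-res microscopy patch pairs.
import torch
import torch.nn as nn

net = nn.Sequential(                      # small convolutional enhancer (toy stand-in)
    nn.Conv2d(1, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 1, 3, padding=1))
opt = torch.optim.Adam(net.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()

# Co-registered pairs: a low-res input patch and its matching high-res target patch,
# resampled to the same pixel grid so they can be compared pixel by pixel.
low_res = torch.rand(8, 1, 128, 128)
high_res = torch.rand(8, 1, 128, 128)

for step in range(3):                     # a few illustrative iterations
    pred = net(low_res)
    loss = loss_fn(pred, high_res)        # penalize deviation from the high-res ground truth
    opt.zero_grad(); loss.backward(); opt.step()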
-
3.
Publication No.: US20240310782A1
Publication Date: 2024-09-19
Application No.: US18546095
Application Date: 2022-02-09
Applicant: THE REGENTS OF THE UNIVERSITY OF CALIFORNIA
Inventor: Aydogan Ozcan , Yair Rivenson , Luzhe Huang , Tairan Liu
CPC classification number: G03H1/0866 , G03H1/0005 , G03H1/0443 , G06T5/50 , G06T5/60 , G03H2001/005 , G03H2001/0458 , G03H2001/0883 , G03H2210/55 , G06T2207/10056 , G06T2207/20084 , G06T2207/30024
Abstract: Digital holography is one of the most widely used label-free microscopy techniques in biomedical imaging. Recovery of the missing phase information of a hologram is an important step in holographic image reconstruction. A convolutional recurrent neural network (RNN)-based phase recovery approach is employed that uses multiple holograms, captured at different sample-to-sensor distances, to rapidly reconstruct the phase and amplitude information of a sample, while also performing autofocusing through the same trained neural network. The success of this deep learning-enabled holography method is demonstrated by imaging microscopic features of human tissue samples and Papanicolaou (Pap) smears. These results constitute the first demonstration of the use of recurrent neural networks for holographic imaging and phase recovery, and compared with existing methods, the presented approach improves the reconstructed image quality while also increasing the depth-of-field and inference speed.
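Illustrative sketch (not part of the patent record): a simple convolutional recurrence that folds in one hologram per step, for a stack captured at several sample-to-sensor distances, and outputs amplitude and phase maps. The update rule, hidden size, and number of heights are assumptions standing in for the trained recurrent network.
# Convolutional recurrent pass over multi-height holograms -> amplitude and phase.
import torch
import torch.nn as nn

class ConvRNNPhaseRecovery(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.update = nn.Sequential(nn.Conv2d(hidden + 1, hidden, 3, padding=1), nn.ReLU())
        self.head = nn.Conv2d(hidden, 2, 3, padding=1)   # channel 0: amplitude, channel 1: phase
        self.hidden = hidden

    def forward(self, holograms):                        # holograms: (batch, n_heights, H, W)
        b, n, h, w = holograms.shape
        state = torch.zeros(b, self.hidden, h, w, device=holograms.device)
        for t in range(n):                               # fold in one hologram per recurrence step
            frame = holograms[:, t:t + 1]
            state = self.update(torch.cat([state, frame], dim=1))
        return self.head(state)

model = ConvRNNPhaseRecovery()
holo_stack = torch.rand(1, 3, 256, 256)                  # three sample-to-sensor distances, as an example
amp_phase = model(holo_stack)                            # (1, 2, 256, 256)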
-
4.
Publication No.: US20240255527A1
Publication Date: 2024-08-01
Application No.: US18563745
Application Date: 2022-05-24
Applicant: THE REGENTS OF THE UNIVERSITY OF CALIFORNIA
Inventor: Aydogan Ozcan , Hyou-Arm Joung , Yi Luo
IPC: G01N33/68 , G01N15/075 , G01N33/543 , G06T7/00 , G06V10/762 , G06V10/82 , G06V20/69
CPC classification number: G01N33/6893 , G01N15/075 , G01N33/54388 , G06T7/0012 , G06V10/762 , G06V10/82 , G06V20/698 , G01N2333/4737 , G01N2800/7095 , G06T2207/10056 , G06T2207/20084 , G06T2207/30004 , G06T2207/30204
Abstract: A quantitative particle agglutination assay device is disclosed that combines portable lens-free microscopy and deep learning for rapidly measuring the concentration of a target analyte. As one example of a target analyte, the assay device was used to test for high-sensitivity C-reactive protein (hs-CRP) using human serum samples. A dual-channel capillary lateral flow device is designed to host the agglutination reaction using a small volume of serum. A portable lens-free microscope records time-lapsed inline holograms of the lateral flow device, monitoring the agglutination process over several minutes. These captured holograms are processed, and at each frame the number and area of the particle clusters are automatically extracted and fed into shallow neural networks to predict the CRP concentration. The system can be used to successfully differentiate very high CRP concentrations (e.g., >10-500 μg/mL) from the hs-CRP range.
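Illustrative sketch (not part of the patent record): extracting per-frame cluster count and total cluster area from reconstructed frames and feeding the resulting feature trajectory into a shallow neural network regressor, as the abstract outlines. The threshold, feature layout, and synthetic training data are assumptions for illustration only.
# Per-frame particle-cluster features -> shallow network predicting CRP concentration.
import numpy as np
from scipy import ndimage
from sklearn.neural_network import MLPRegressor

def frame_features(frame, threshold=0.5):
    mask = frame > threshold                       # segment particle clusters (illustrative threshold)
    _, n_clusters = ndimage.label(mask)            # connected-component count
    total_area = int(mask.sum())
    return [n_clusters, total_area]

# Time-lapsed frames for one assay: the feature trajectory is flattened into one vector.
frames = [np.random.rand(64, 64) for _ in range(5)]
x = np.concatenate([frame_features(f) for f in frames]).reshape(1, -1)

# Shallow network mapping feature trajectories to CRP concentration (toy training data).
X_train = np.random.rand(20, x.shape[1])
y_train = np.random.rand(20) * 10.0                # ug/mL, synthetic
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000).fit(X_train, y_train)
print(model.predict(x))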
-
5.
Publication No.: US12038370B2
Publication Date: 2024-07-16
Application No.: US17621979
Application Date: 2020-07-02
Applicant: THE REGENTS OF THE UNIVERSITY OF CALIFORNIA
Inventor: Aydogan Ozcan , Aniruddha Ray , Yibo Zhang , Dino Di Carlo
IPC: G01N15/1433 , B03C1/01 , B03C1/02 , G01N15/1434 , G03H1/00 , G06V10/147 , G06V10/82 , G06V20/69 , G01N15/01 , G01N15/10
CPC classification number: G01N15/1433 , B03C1/01 , B03C1/02 , G01N15/1434 , G03H1/0005 , G06V10/147 , G06V10/82 , G06V20/693 , G06V20/698 , B03C2201/18 , B03C2201/26 , G01N15/01 , G01N2015/1006 , G03H2001/005 , G03H2222/12
Abstract: A computational cytometer operates using magnetically modulated lensless speckle imaging, which introduces oscillatory motion to magnetic bead-conjugated rare cells of interest through a periodic magnetic force and uses lensless, time-resolved holographic speckle imaging to rapidly detect the target cells in three dimensions (3D). Detection specificity is further enhanced through a deep learning-based classifier that is based on a densely connected pseudo-3D convolutional neural network (P3D CNN), which automatically detects rare cells of interest based on their spatio-temporal features under a controlled magnetic force. This compact, cost-effective, and high-throughput computational cytometer can be used for rare cell detection and quantification in bodily fluids for a variety of biomedical applications.
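Illustrative sketch (not part of the patent record): a pseudo-3D convolution block, which factorizes a full 3-D kernel into a spatial (1x3x3) convolution followed by a temporal (3x1x1) convolution so that spatio-temporal features of the oscillating, bead-labeled cells can be learned at lower cost than with full 3-D convolutions. Channel counts and clip size are assumptions, not the patented network.
# Pseudo-3D (P3D) convolution block over time-resolved speckle frames.
import torch
import torch.nn as nn

class P3DBlock(nn.Module):
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.spatial = nn.Conv3d(in_ch, out_ch, kernel_size=(1, 3, 3), padding=(0, 1, 1))
        self.temporal = nn.Conv3d(out_ch, out_ch, kernel_size=(3, 1, 1), padding=(1, 0, 0))
        self.act = nn.ReLU()

    def forward(self, x):                      # x: (batch, channels, time, height, width)
        return self.act(self.temporal(self.act(self.spatial(x))))

clip = torch.rand(1, 1, 16, 64, 64)            # 16 time-resolved speckle frames around one candidate cell
features = P3DBlock(1, 8)(clip)                # (1, 8, 16, 64, 64) spatio-temporal feature maps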
-
6.
Publication No.: US20240135544A1
Publication Date: 2024-04-25
Application No.: US18543168
Application Date: 2023-12-18
Applicant: THE REGENTS OF THE UNIVERSITY OF CALIFORNIA
Inventor: Aydogan Ozcan , Yair Rivenson , Hongda Wang , Zhensong Wei
IPC: G06T7/11 , G06F18/214 , G06N3/08 , G06V10/764 , G06V10/82 , G16H30/20 , G16H30/40 , G16H70/60
CPC classification number: G06T7/11 , G06F18/2155 , G06N3/08 , G06V10/764 , G06V10/82 , G16H30/20 , G16H30/40 , G16H70/60
Abstract: A deep learning-based digital staining method and system are disclosed that enable the creation of digitally/virtually-stained microscopic images from label-free or stain-free samples based on autofluorescence images acquired using a fluorescence microscope. The system and method have particular applicability for the creation of digitally/virtually-stained whole slide images (WSIs) of unlabeled/unstained tissue samples that are analyzed by a histopathologist. The methods bypass the standard histochemical staining process, saving time and cost. The method is based on deep learning and uses, in one embodiment, a convolutional neural network trained using a generative adversarial network model to transform fluorescence images of an unlabeled sample into an image that is equivalent to the brightfield image of the chemically stained version of the same sample. This label-free digital staining method eliminates cumbersome and costly histochemical staining procedures and significantly simplifies tissue preparation in the pathology and histology fields.
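Illustrative sketch (not part of the patent record): a tiny encoder-decoder generator that maps a single-channel autofluorescence image to a three-channel brightfield-equivalent image, the image-to-image transform the abstract describes. The layer sizes and skip connection are assumptions; the network described above is trained adversarially rather than used untrained as here.
# Toy encoder-decoder generator: autofluorescence (1 channel) -> brightfield-like RGB.
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.down = nn.Sequential(nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.bottleneck = nn.Sequential(nn.Conv2d(32, 64, 3, padding=1), nn.ReLU())
        self.out = nn.Conv2d(64 + 1, 3, 3, padding=1)   # fuse upsampled features with a skip from the input

    def forward(self, x):
        feats = self.bottleneck(self.down(x))                                   # downsample, then encode
        up = nn.functional.interpolate(feats, size=x.shape[-2:], mode="bilinear", align_corners=False)
        return torch.sigmoid(self.out(torch.cat([up, x], dim=1)))               # RGB brightfield-equivalent

autofluorescence = torch.rand(1, 1, 256, 256)
virtual_stain = TinyUNet()(autofluorescence)   # (1, 3, 256, 256)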
-
7.
Publication No.: US20230401447A1
Publication Date: 2023-12-14
Application No.: US18316474
Application Date: 2023-05-12
Applicant: THE REGENTS OF THE UNIVERSITY OF CALIFORNIA
Inventor: Aydogan Ozcan , Yair Rivenson , Xing Lin , Deniz Mengu , Yi Luo
IPC: G06N3/082 , G02B5/18 , G02B27/42 , G06N3/04 , G06N3/08 , G06V10/94 , G06F18/214 , G06F18/2431
CPC classification number: G06N3/082 , G02B5/1866 , G02B27/4205 , G02B27/4277 , G06N3/04 , G06N3/08 , G06V10/95 , G06F18/214 , G06F18/2431
Abstract: An all-optical Diffractive Deep Neural Network (D2NN) architecture learns to implement various functions or tasks after deep learning-based design of the passive diffractive or reflective substrate layers that work collectively to perform the desired function or task. This architecture was experimentally confirmed by creating 3D-printed D2NNs that learned to implement handwritten digit classification and the function of an imaging lens in the terahertz spectrum. This all-optical deep learning framework can perform, at the speed of light, various complex functions and tasks that computer-based neural networks can implement, and will find applications in all-optical image analysis, feature detection, and object classification, also enabling new camera designs and optical components that can learn to perform unique tasks using D2NNs. In alternative embodiments, the all-optical D2NN is used as a front-end in conjunction with a trained, digital neural network back-end.
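Illustrative sketch (not part of the patent record): a differentiable forward model of a few phase-only diffractive layers separated by free-space propagation (angular spectrum method), which is the kind of model used to design the passive layers by deep learning. The wavelength, pixel pitch, layer spacing, and grid size are placeholder assumptions.
# Differentiable D2NN-style forward model: trainable phase masks + angular-spectrum propagation.
import torch
import math

N, pitch, wavelength, z = 128, 400e-6, 0.75e-3, 0.03   # grid, pixel pitch (m), THz-range wavelength (m), layer spacing (m)

fx = torch.fft.fftfreq(N, d=pitch)
FX, FY = torch.meshgrid(fx, fx, indexing="ij")
arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
kz = 2 * math.pi / wavelength * torch.sqrt(torch.clamp(arg, min=0.0))
H = torch.exp(1j * kz * z)                               # angular-spectrum transfer function for one gap

def propagate(field):
    return torch.fft.ifft2(torch.fft.fft2(field) * H)

phase_layers = [torch.zeros(N, N, requires_grad=True) for _ in range(3)]   # trainable phase-only masks

def d2nn_forward(field):
    for phase in phase_layers:
        field = propagate(field) * torch.exp(1j * phase)  # propagate to the layer, then modulate its phase
    return propagate(field).abs() ** 2                    # detector plane records intensity

input_field = torch.ones(N, N, dtype=torch.complex64)     # placeholder for an encoded input object
intensity = d2nn_forward(input_field)                     # differentiable w.r.t. the phase masks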
-
8.
Publication No.: US20230030424A1
Publication Date: 2023-02-02
Application No.: US17783260
Application Date: 2020-12-22
Applicant: THE REGENTS OF THE UNIVERSITY OF CALIFORNIA
Inventor: Aydogan Ozcan , Yair Rivenson , Hongda Wang , Yilin Luo , Kevin de Haan , Yijie Zhang , Bijie Bai
Abstract: A deep learning-based digital/virtual staining method and system enable the creation of digitally/virtually-stained microscopic images from label-free or stain-free samples. In one embodiment, the method generates digitally/virtually-stained microscope images of label-free or unstained samples using fluorescence lifetime imaging (FLIM) image(s) of the sample(s) acquired with a fluorescence microscope. In another embodiment, a digital/virtual autofocusing method is provided that uses machine learning to generate a microscope image with improved focus using a trained, deep neural network. In another embodiment, a trained deep neural network generates digitally/virtually-stained microscopic images, containing multiple different stains, of a label-free or unstained sample obtained with a microscope. The multiple stains in the output image, or sub-regions thereof, are substantially equivalent to the corresponding microscopic images or image sub-regions of the same sample that has been histochemically stained.
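Illustrative sketch (not part of the patent record): composing a single output in which different sub-regions carry different virtual stains, as in the multi-stain embodiment above. Here "stain_a" and "stain_b" stand in for outputs of trained stain-specific networks and "region_mask" for a micro-structure map; all three are synthetic placeholders.
# Per-region composition of multiple virtual stains into one output image.
import numpy as np

h, w = 256, 256
stain_a = np.random.rand(h, w, 3)           # e.g., one virtual stain rendering of the unlabeled sample
stain_b = np.random.rand(h, w, 3)           # e.g., a second virtual stain of the same sample
region_mask = np.zeros((h, w), dtype=bool)  # True where the second stain should be shown
region_mask[:, w // 2:] = True

composite = np.where(region_mask[..., None], stain_b, stain_a)   # per-region stain selection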
-
9.
Publication No.: US20220113671A1
Publication Date: 2022-04-14
Application No.: US17298182
Application Date: 2019-12-03
Applicant: THE REGENTS OF THE UNIVERSITY OF CALIFORNIA
Inventor: Aydogan Ozcan , Aniruddha Ray , Mustafa Daloglu
Abstract: A UV holographic imaging device offers a low-cost, portable, and robust technique to image and distinguish protein crystals from salt crystals, without the need for any expensive and bulky optical components. This “on-chip” device uses a UV LED and a de-capped, consumer-grade CMOS image sensor interfaced to a processor or microcontroller; the information from the crystal samples, which are placed very close to the sensor's active area, is captured in the form of in-line holograms and extracted through digital back-propagation. In these holographic amplitude and/or phase reconstructions, protein crystals appear significantly darker than the background due to their strong UV absorption, unlike salt crystals, enabling one to clearly distinguish protein crystals from salt crystals. The on-chip UV holographic microscope serves as a low-cost, sensitive, and robust alternative to the conventional lens-based UV microscopes used in protein crystallography.
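Illustrative sketch (not part of the patent record): angular-spectrum digital back-propagation of an in-line hologram to a chosen object plane, the reconstruction step referenced above. The UV wavelength, pixel pitch, propagation distance, and the random "hologram" array are placeholder assumptions.
# Angular-spectrum back-propagation of an in-line hologram to amplitude and phase images.
import numpy as np

def backpropagate(hologram, wavelength, pixel_pitch, z):
    n, m = hologram.shape
    fx = np.fft.fftfreq(m, d=pixel_pitch)
    fy = np.fft.fftfreq(n, d=pixel_pitch)
    FX, FY = np.meshgrid(fx, fy)
    arg = np.maximum(1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2, 0.0)
    kernel = np.exp(-1j * 2 * np.pi / wavelength * z * np.sqrt(arg))   # back-propagation kernel
    field = np.fft.ifft2(np.fft.fft2(hologram) * kernel)
    return np.abs(field), np.angle(field)                              # amplitude and phase reconstructions

hologram = np.random.rand(512, 512)            # placeholder for a captured in-line hologram frame
amp, phase = backpropagate(hologram, wavelength=285e-9, pixel_pitch=1.1e-6, z=500e-6)
# In the amplitude image, strongly UV-absorbing protein crystals appear darker than the
# background, whereas salt crystals do not.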
-
10.
Publication No.: US20210382052A1
Publication Date: 2021-12-09
Application No.: US17285906
Application Date: 2019-10-18
Applicant: THE REGENTS OF THE UNIVERSITY OF CALIFORNIA
Inventor: Hyou-Arm Joung , Zachary S. Ballard , Omai Garner , Dino Di Carlo , Aydogan Ozcan
IPC: G01N33/569 , B01L3/00
Abstract: A multiplexed vertical flow serodiagnostic testing device for diseases such as Lyme disease includes one or more multi-piece cassettes that contain vertical stacks of functionalized porous layers. A bottom piece of the cassette includes a sensing membrane with a plurality of spatially multiplexed immunoreaction spots or locations. Top pieces are used to deliver sample and/or buffer solutions along with antibody-conjugated nanoparticles for binding with the immunoreaction spots or locations. A colorimetric signal is generated by the nanoparticles captured on the sensing membrane containing disease-specific antigens. The sensing membrane is imaged by a cost-effective portable reader device. The images captured by the reader device are subject to image processing and analysis to generate a positive (+) or negative (−) indication for the sample. A concentration of one or more biomarkers may also be generated. The testing device is rapid, simple, and inexpensive, and allows for simultaneous measurement of multiple antibodies and/or antigens, making it an ideal point-of-care platform for disease diagnosis.
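Illustrative sketch (not part of the patent record): a simple analysis of a reader image of the sensing membrane, averaging intensity over known multiplexed spot locations, normalizing against a control spot, and thresholding to a +/- call. Spot coordinates, radius, normalization, and cutoff are assumptions; the actual reader's processing pipeline may differ.
# Turning a reader image of the sensing membrane into a positive/negative indication.
import numpy as np

def spot_signal(image, cy, cx, radius=6):
    ys, xs = np.ogrid[:image.shape[0], :image.shape[1]]
    mask = (ys - cy) ** 2 + (xs - cx) ** 2 <= radius ** 2
    return float(image[mask].mean())                     # mean intensity over the spot

membrane = np.random.rand(200, 200)                      # placeholder grayscale reader image
antigen_spots = [(60, 60), (60, 140), (140, 60)]         # multiplexed disease-specific antigen spots (assumed layout)
control_spot = (140, 140)

control = spot_signal(membrane, *control_spot)
signals = [spot_signal(membrane, y, x) / control for (y, x) in antigen_spots]   # normalize to the control spot
result = "positive (+)" if np.mean(signals) > 1.2 else "negative (-)"           # illustrative cutoff
print(signals, result)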
-