• Title/Abstract/Keywords: Computer Aided Diagnostic

Search results: 53 items (processing time: 0.019 seconds)

Effects of Implementing Artificial Intelligence-Based Computer-Aided Detection for Chest Radiographs in Daily Practice on the Rate of Referral to Chest Computed Tomography in Pulmonology Outpatient Clinic

  • Wonju Hong;Eui Jin Hwang;Chang Min Park;Jin Mo Goo
    • Korean Journal of Radiology
    • /
    • Vol. 24, No. 9
    • /
    • pp.890-902
    • /
    • 2023
  • Objective: The clinical impact of artificial intelligence-based computer-aided detection (AI-CAD) beyond diagnostic accuracy remains uncertain. We aimed to investigate the influence of the clinical implementation of AI-CAD for chest radiograph (CR) interpretation in daily practice on the rate of referral for chest computed tomography (CT). Materials and Methods: AI-CAD was implemented in clinical practice at the Seoul National University Hospital. CRs obtained from patients who visited the pulmonology outpatient clinics before (January-December 2019) and after (January-December 2020) implementation were included in this study. After implementation, the referring pulmonologist requested CRs with or without AI-CAD analysis. We conducted multivariable logistic regression analyses to evaluate the associations between the use of AI-CAD and the following study outcomes: the rate of chest CT referral, defined as the request and actual acquisition of chest CT within 30 days after CR acquisition, and the CT referral rates separately for subsequent positive and negative CT results. The multivariable analyses included covariates such as patient age and sex, time of CR acquisition (before versus after AI-CAD implementation), referring pulmonologist, nature of the CR examination (baseline versus follow-up examination), and the presence of radiology reports at the time of the pulmonology visit. Results: A total of 28,546 CRs from 14,565 patients (mean age: 67 years; 7,130 males) and 25,888 CRs from 12,929 patients (mean age: 67 years; 6,435 males) before and after AI-CAD implementation, respectively, were included. The use of AI-CAD was independently associated with increased chest CT referrals (odds ratio [OR], 1.33; P = 0.008) and with referrals yielding subsequent negative chest CT results (OR, 1.46; P = 0.005). Meanwhile, referrals with positive chest CT results were not significantly associated with AI-CAD use (OR, 1.08; P = 0.647). Conclusion: The use of AI-CAD for CR interpretation in pulmonology outpatients was independently associated with an increased frequency of overall referrals for chest CT scans and of referrals with subsequent negative results.
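
A minimal sketch of how a multivariable logistic regression of this kind can be fit, using the statsmodels formula API. This is not the authors' code; the file name and column names (ct_referral, ai_cad, age, sex, follow_up, report_available) are hypothetical stand-ins for the outcome and covariates listed in the abstract.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-radiograph table: one row per CR with outcome and covariates.
df = pd.read_csv("cr_visits.csv")

# Binary outcome (CT referral within 30 days) regressed on AI-CAD use and covariates.
model = smf.logit(
    "ct_referral ~ ai_cad + age + C(sex) + C(follow_up) + C(report_available)",
    data=df,
).fit()

odds_ratios = np.exp(model.params)       # odds ratios for each covariate
conf_int = np.exp(model.conf_int())      # 95% confidence intervals on the OR scale
print(odds_ratios["ai_cad"], model.pvalues["ai_cad"])
```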

Support Vector Machine Based Diagnostic System for Thyroid Cancer using Statistical Texture Features

  • Gopinath, B.;Shanthi, N.
    • Asian Pacific Journal of Cancer Prevention
    • /
    • Vol. 14, No. 1
    • /
    • pp.97-102
    • /
    • 2013
  • Objective: The aim of this study was to develop an automated computer-aided diagnostic system for diagnosis of thyroid cancer patterns in fine needle aspiration cytology (FNAC) microscopic images with a high degree of sensitivity and specificity, using statistical texture features and a support vector machine (SVM) classifier. Materials and Methods: A training set of 40 benign and 40 malignant FNAC images and a testing set of 10 benign and 20 malignant FNAC images were used to perform the diagnosis of thyroid cancer. Initially, segmentation of the region of interest (ROI) was performed by region-based morphology segmentation. The developed diagnostic system utilized statistical texture features derived from the segmented images using a Gabor filter bank at various wavelengths and angles. Finally, the SVM was used as a machine learning algorithm to identify benign and malignant states of thyroid nodules. Results: The SVM achieved a diagnostic accuracy of 96.7% with a sensitivity and specificity of 95% and 100%, respectively, at a wavelength of 4 and an angle of 45°. Conclusion: The results show that the diagnosis of thyroid cancer in FNAC images can be effectively performed using statistical texture information derived with Gabor filters in association with an SVM.
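
A minimal sketch of this kind of pipeline, not the authors' implementation: statistical texture features from a Gabor filter bank applied to a grayscale FNAC region of interest, classified with an SVM. The filter frequencies, angles, and feature set below are illustrative assumptions (a frequency of 0.25 cycles/pixel corresponds to the wavelength of 4 reported above).

```python
import numpy as np
from skimage.filters import gabor
from sklearn.svm import SVC

def gabor_texture_features(roi, frequencies=(0.1, 0.25), angles_deg=(0, 45, 90, 135)):
    """Mean, variance, and entropy of the Gabor magnitude response for each filter."""
    feats = []
    for freq in frequencies:
        for angle in angles_deg:
            real, imag = gabor(roi, frequency=freq, theta=np.deg2rad(angle))
            mag = np.hypot(real, imag)                 # magnitude of the filter response
            hist, _ = np.histogram(mag, bins=32)
            p = hist / hist.sum()
            p = p[p > 0]
            feats += [mag.mean(), mag.var(), -np.sum(p * np.log2(p))]
    return np.asarray(feats)

# rois_train / rois_test: lists of segmented grayscale ROI arrays (hypothetical names);
# y_train: 0 = benign, 1 = malignant.
# clf = SVC(kernel="rbf").fit([gabor_texture_features(r) for r in rois_train], y_train)
# y_pred = clf.predict([gabor_texture_features(r) for r in rois_test])
```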

A Computer-Aided Diagnosis of Brain Tumors Using a Fine-Tuned YOLO-based Model with Transfer Learning

  • Montalbo, Francis Jesmar P.
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • Vol. 14, No. 12
    • /
    • pp.4816-4834
    • /
    • 2020
  • This paper proposes transfer learning and fine-tuning techniques for a deep learning model to detect three distinct brain tumors from Magnetic Resonance Imaging (MRI) scans. In this work, the recent YOLOv4 model was trained using a collection of 3064 T1-weighted Contrast-Enhanced (CE)-MRI scans that were pre-processed and labeled for the task. The work used the partial 29-layer YOLOv4-Tiny, fine-tuned to work optimally and run efficiently on most platforms with reliable performance. With the help of transfer learning, the model had initial leverage to train faster with pre-trained weights from the COCO dataset, generating a robust set of features required for brain tumor detection. The results yielded the highest mean average precision of 93.14%, a precision of 90.34%, a recall of 88.58%, and an F1-score of 89.45%, outperforming previous versions of the YOLO detection models and other studies that used bounding-box detection for the same task, such as Faster R-CNN. In conclusion, YOLOv4-Tiny can detect brain tumors automatically and at a rapid pace with the help of proper fine-tuning and transfer learning. This work mainly contributes to assisting medical experts in the diagnostic process of brain tumors.
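
The paper fine-tunes a darknet-based YOLOv4-Tiny; as a rough illustration of the same transfer-learning pattern (pre-trained backbone, a new detection head sized for the target classes, selective fine-tuning), here is a sketch using torchvision's Faster R-CNN as a stand-in detector. This is an assumption-laden substitute, not the authors' implementation, and the three tumor class names are likewise assumed rather than taken from the abstract.

```python
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

num_classes = 1 + 3  # background + three assumed tumor classes

# Detector pre-trained on COCO (torchvision >= 0.13 string weights API).
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")

# Replace the COCO-trained box predictor with one sized for the tumor classes.
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)

# Freeze the backbone so only the new head (and RPN) is fine-tuned at first.
for p in model.backbone.parameters():
    p.requires_grad = False

# trainable = [p for p in model.parameters() if p.requires_grad]
# optimizer = torch.optim.SGD(trainable, lr=5e-3, momentum=0.9, weight_decay=5e-4)
```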

Effect of a Deep Learning Framework-Based Computer-Aided Diagnosis System on the Diagnostic Performance of Radiologists in Differentiating between Malignant and Benign Masses on Breast Ultrasonography

  • Ji Soo Choi;Boo-Kyung Han;Eun Sook Ko;Jung Min Bae;Eun Young Ko;So Hee Song;Mi-ri Kwon;Jung Hee Shin;Soo Yeon Hahn
    • Korean Journal of Radiology
    • /
    • Vol. 20, No. 5
    • /
    • pp.749-758
    • /
    • 2019
  • Objective: To investigate whether a computer-aided diagnosis (CAD) system based on a deep learning framework (deep learning-based CAD) improves the diagnostic performance of radiologists in differentiating between malignant and benign masses on breast ultrasound (US). Materials and Methods: B-mode US images were prospectively obtained for 253 breast masses (173 benign, 80 malignant) in 226 consecutive patients. Breast mass US findings were retrospectively analyzed by deep learning-based CAD and four radiologists. In predicting malignancy, the CAD results were dichotomized (possibly benign vs. possibly malignant). The radiologists independently assigned Breast Imaging Reporting and Data System (BI-RADS) final assessments for two datasets (US images alone or with CAD). For each dataset, the radiologists' final assessments were classified as positive (category 4a or higher) or negative (category 3 or lower). The diagnostic performances of the radiologists for the two datasets (US alone vs. US with CAD) were compared. Results: When the CAD results were added to the US images, the radiologists showed significant improvements in specificity (range across all radiologists for US alone vs. US with CAD: 72.8-92.5% vs. 82.1-93.1%; p < 0.001), accuracy (77.9-88.9% vs. 86.2-90.9%; p = 0.038), and positive predictive value (PPV) (60.2-83.3% vs. 70.4-85.2%; p = 0.001). However, there were no significant changes in sensitivity (81.3-88.8% vs. 86.3-95.0%; p = 0.120) or negative predictive value (91.4-93.5% vs. 92.9-97.3%; p = 0.259). Conclusion: Deep learning-based CAD could improve radiologists' diagnostic performance by increasing their specificity, accuracy, and PPV in differentiating between malignant and benign masses on breast US.
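
A minimal sketch, with a hypothetical data layout rather than the study's analysis code, of how sensitivity, specificity, accuracy, PPV, and NPV can be computed from dichotomized BI-RADS assessments, plus McNemar's test as one common choice for a paired comparison of US alone versus US with CAD.

```python
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

def diagnostic_metrics(y_true, y_pred):
    """y_true: 1 = malignant, 0 = benign; y_pred: 1 = BI-RADS >= 4a, 0 = <= 3."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Paired comparison for the same masses read twice (hypothetical arrays):
# correct_alone, correct_cad = boolean arrays of per-mass correctness.
# table = [[both correct, alone-only correct], [cad-only correct, both wrong]]
# print(mcnemar(table, exact=True).pvalue)
```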

Computer-Aided Detection with Automated Breast Ultrasonography for Suspicious Lesions Detected on Breast MRI

  • Kim, Sanghee;Kang, Bong Joo;Kim, Sung Hun;Lee, Jeongmin;Park, Ga Eun
    • Investigative Magnetic Resonance Imaging
    • /
    • Vol. 23, No. 1
    • /
    • pp.46-54
    • /
    • 2019
  • Purpose: The aim of this study was to evaluate the diagnostic performance of a computer-aided detection (CAD) system used with automated breast ultrasonography (ABUS) for suspicious lesions detected on breast MRI, and to analyze lesions falsely marked by CAD. Materials and Methods: We included a total of 40 patients diagnosed with breast cancer who underwent ABUS (ACUSON S2000) to evaluate multiple suspicious lesions found on MRI. We used CAD (QVCAD™) in all the ABUS examinations. We evaluated the diagnostic accuracy of CAD and analyzed the characteristics of CAD-detected lesions and the factors underlying false-positive and false-negative cases. We also analyzed false-positive lesions marked by CAD on ABUS. Results: Of a total of 122 suspicious lesions detected on MRI in 40 patients, we excluded 51 daughter nodules near the main breast cancer within the same quadrant and included 71 lesions. We also analyzed 23 false-positive lesions marked by CAD with ABUS. The sensitivity, specificity, positive predictive value, and negative predictive value of CAD (for 94 lesions) with ABUS were 75.5%, 44.4%, 59.7%, and 62.5%, respectively. CAD facilitated the detection of 81.4% (35/43) of the invasive ductal cancers and 84.9% (28/33) of the invasive ductal cancers that presented as a mass (excluding non-mass lesions). CAD also revealed 90.3% (28/31) of the invasive ductal cancers measuring larger than 1 cm (excluding non-mass lesions and those smaller than 1 cm). The mean sizes of the true-positive versus false-negative mass lesions were 2.08 ± 0.85 cm versus 1.60 ± 1.28 cm (P < 0.05). False-positive lesions included sclerosing adenosis and usual ductal hyperplasia. Among the 23 false CAD marks, the most common cause (18/23) was marginal or subareolar shadowing, followed by three simple cysts, a hematoma, and a skin wart. Conclusion: CAD with ABUS showed promising sensitivity for the detection of invasive ductal cancers presenting as masses larger than 1 cm on MRI.
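
The size comparison reported above (true-positive vs. false-negative masses, P < 0.05) amounts to an independent-samples test on two groups of lesion sizes. A minimal sketch follows; the arrays are synthetic placeholders generated only so the snippet runs, not study data, and the group sizes are assumed.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
tp_sizes_cm = rng.normal(loc=2.08, scale=0.85, size=35)  # placeholder values only
fn_sizes_cm = rng.normal(loc=1.60, scale=1.28, size=8)   # placeholder values only

# Welch's t-test (unequal variances) comparing detected vs. missed mass sizes.
t_stat, p_value = stats.ttest_ind(tp_sizes_cm, fn_sizes_cm, equal_var=False)
print(f"mean TP = {tp_sizes_cm.mean():.2f} cm, mean FN = {fn_sizes_cm.mean():.2f} cm, "
      f"t = {t_stat:.2f}, p = {p_value:.3f}")
```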

Full-mouth rehabilitation of severely attrited dentition with missing posterior teeth: a case report using digital workflow with jaw motion tracking

  • 박찬영;이영후;홍성진;백장현;노관태;배아란;김형섭;권긍록
    • The Journal of Korean Academy of Prosthodontics
    • /
    • Vol. 61, No. 4
    • /
    • pp.293-307
    • /
    • 2023
  • Jaw motion tracking, which has recently been introduced in various case reports, is a method that records the patient's facebow transfer and individualized mandibular movement paths and then reproduces them in the virtual space of computer-aided design/computer-aided manufacturing (CAD-CAM) software. In the present case, collapse of the occlusal plane was observed due to long-standing loss of the posterior teeth, so full-mouth rehabilitation with an increase in the vertical dimension of occlusion was planned. First, jaw motion tracking was performed to record the patient's mandibular movements at the new centric relation, and this information was combined with the patient's initial intraoral data and three-dimensional facial scan data to create a virtual patient. Provisional restorations were fabricated based on the digital wax-up and implant placement plan established on this virtual patient. After verifying that the provisional restorations provided appropriate canine guidance at the newly established vertical dimension of occlusion, treatment proceeded to the definitive restorations. Because the patient was satisfied with the improvements in masticatory function and esthetics achieved by taking advantage of digital dentistry, we report this case.

Application of Computer-Aided Diagnosis for the Differential Diagnosis of Fatty Liver in Computed Tomography Image

  • 박형후;이진수
    • Journal of the Korean Society of Radiology
    • /
    • Vol. 10, No. 6
    • /
    • pp.443-450
    • /
    • 2016
  • In this study, texture feature analysis and ROC curve analysis were performed on abdominal computed tomography (CT) images of patients with fatty liver. As a preliminary experimental study toward the implementation of a computer-aided diagnosis system, it aimed to provide physicians with objective and reliable diagnostic information for fatty liver in CT images. In the experiment, normal and fatty-liver abdominal CT images were used as test images; a wavelet transform was applied to the selected regions, and statistical results were obtained for six parameters representing texture features. As a result, entropy, mean brightness, and skewness showed relatively high recognition rates of 90% or more, whereas contrast, smoothness, and uniformity showed relatively low recognition rates of about 70%. In the ROC curve analysis, all six parameters showed areas under the curve of 0.900 or greater (p = 0.0001), indicating that they are meaningful for disease recognition. In addition, cut-off values for disease prediction were determined for the six parameters. These results may be applicable in the future as preliminary diagnostic data for automatic disease detection and final diagnosis in abdominal CT images.
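
A minimal sketch of this kind of pipeline under stated assumptions, not the paper's implementation: a 2-D wavelet transform of a liver ROI, six first-order texture parameters (named here to mirror the abstract), and ROC analysis of one parameter across normal versus fatty-liver ROIs. The wavelet family, histogram bins, and the use of the approximation sub-band are assumptions.

```python
import numpy as np
import pywt
from scipy.stats import skew
from sklearn.metrics import roc_curve, roc_auc_score

def texture_parameters(roi):
    """Six texture values computed from the wavelet approximation sub-band of the ROI."""
    cA, (cH, cV, cD) = pywt.dwt2(roi.astype(float), "haar")
    x = cA.ravel()
    hist, _ = np.histogram(x, bins=64)
    p = hist / hist.sum()
    p_nz = p[p > 0]
    var = x.var()
    return {
        "mean_brightness": x.mean(),
        "contrast": x.std(),
        "entropy": -np.sum(p_nz * np.log2(p_nz)),
        "skewness": skew(x),
        "smoothness": 1.0 - 1.0 / (1.0 + var),
        "uniformity": np.sum(p ** 2),
    }

# rois: list of 2-D ROI arrays; labels: 1 = fatty liver, 0 = normal (assumed layout).
# scores = [texture_parameters(r)["entropy"] for r in rois]
# auc = roc_auc_score(labels, scores)
# fpr, tpr, thresholds = roc_curve(labels, scores)
# cutoff = thresholds[np.argmax(tpr - fpr)]   # Youden-index cut-off
```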

Automatic Sputum Color Image Segmentation for Lung Cancer Diagnosis

  • Taher, Fatma;Werghi, Naoufel;Al-Ahmad, Hussain
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • Vol. 7, No. 1
    • /
    • pp.68-80
    • /
    • 2013
  • Lung cancer is considered the leading cause of cancer death worldwide. A commonly used technique consists of analyzing sputum images to detect lung cancer cells. However, the analysis of sputum is time consuming and requires highly trained personnel to avoid errors, so manual screening of sputum samples needs to be improved with image processing techniques. In this paper we present a Computer Aided Diagnosis (CAD) system for early detection and diagnosis of lung cancer based on the analysis of sputum color images, with the aim of attaining a high accuracy rate and reducing the time needed to analyze such samples. In order to form general diagnostic rules, we present a framework for segmentation and extraction of sputum cells in sputum images using a Bayesian classification method followed by region detection and feature extraction techniques to determine the shape of the nuclei inside the sputum cells. The final results will be used in a CAD system for early detection of lung cancer. We analyzed the performance of Bayesian classification with respect to the color space representation and quantification. Our methods were validated via a series of experiments conducted on a data set of 100 images. Our evaluation criteria were based on sensitivity, specificity, and accuracy.
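
A minimal sketch of the first stage described above under stated assumptions, not the authors' system: Bayesian classification of sputum-image pixels by color, here with a Gaussian naive Bayes model over RGB values. The function names and labeling convention are hypothetical.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

def fit_pixel_classifier(images, masks):
    """images: list of HxWx3 RGB arrays; masks: matching HxW arrays with
    1 = sputum-cell pixel and 0 = background (assumed labeling)."""
    X = np.concatenate([im.reshape(-1, 3) for im in images])
    y = np.concatenate([m.ravel() for m in masks])
    return GaussianNB().fit(X, y)

def segment(image, clf):
    """Classify every pixel; the resulting binary map can then feed the
    region-detection and nucleus feature-extraction stages."""
    labels = clf.predict(image.reshape(-1, 3))
    return labels.reshape(image.shape[:2]).astype(np.uint8)
```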

As how artificial intelligence is revolutionizing endoscopy

  • Jean-Francois Rey
    • Clinical Endoscopy
    • /
    • Vol. 57, No. 3
    • /
    • pp.302-308
    • /
    • 2024
  • With incessant advances in information technology and its implications in all domains of our lives, artificial intelligence (AI) has emerged as a requirement for improved machine performance. This raises the question of how AI can benefit endoscopists and improve both diagnostic and therapeutic endoscopy in each part of the gastrointestinal tract, as well as the question of the recent benefits and clinical usefulness of this new technology in daily endoscopic practice. There are two main categories of AI systems: computer-assisted detection (CADe) for lesion detection and computer-assisted diagnosis (CADx) for optical biopsy and lesion characterization. Quality assurance is the next step in the complete monitoring of high-quality colonoscopies. In all cases, computer-aided endoscopy is used, as the overall results rely on the physician. Video capsule endoscopy is a unique example in which a computer operates a device, stores multiple images, and performs an accurate diagnosis. While there are many expectations, we need to standardize and assess the various software packages. It is important for healthcare providers to support this new development and make its use an obligation in daily clinical practice. In summary, AI represents a breakthrough in digestive endoscopy. Screening for gastric and colonic cancer detection should be improved, particularly outside expert centers. Prospective and multicenter trials are mandatory before introducing new software into clinical practice.

Analysis of Diagnosis Algorithm Implemented in TCU for High-Speed Tracked Vehicles

  • 정규홍
    • Journal of Drive and Control
    • /
    • Vol. 15, No. 4
    • /
    • pp.30-38
    • /
    • 2018
  • Electronic control units (ECUs) are now commonplace in the automotive industry and have evolved further toward the high-end application of autonomous vehicles. Such digital technologies have also become widespread in agriculture and construction equipment. Likewise, transmission control of high-speed tracked vehicles is based on a transmission control unit (TCU) that performs complex gear-change control functions and diagnostic algorithms (the TCU's self-diagnostic capability and its reporting of malfunction data through CAN communication). Since all functions of the TCU are implemented in embedded software, it is hardly possible to analyze its specifications by reverse engineering. In this paper, a real-time transmission simulator adaptable to the TCU is presented for analysis of the diagnosis algorithm and its standards. Signal simulation circuits are deliberately designed considering the electrical characteristics of the TCU inputs, and various analysis tools, such as an analog-input auto-scan function and a global output-enable switch, are implemented in software. Test results from the hardware-in-the-loop simulator verify the tolerance time for each error, as well as the cause of each fault and the error-reset conditions.
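
A minimal sketch of the kind of check such a hardware-in-the-loop setup performs, with hypothetical signal names and thresholds rather than the actual TCU specification: drive an analog input out of its plausible range and measure how long the controller tolerates it before latching a fault, i.e. the error tolerance time discussed above.

```python
import time

FAULT_THRESHOLD_V = 4.8   # assumed upper plausibility limit of the sensor input
TOLERANCE_LIMIT_S = 0.5   # assumed maximum allowed detection delay

def measure_tolerance_time(set_analog_input, read_fault_flag, timeout_s=2.0):
    """set_analog_input(volts) drives the simulator's signal output;
    read_fault_flag() returns True once the TCU reports the fault
    (e.g. via a CAN status frame). Both callables are assumed interfaces."""
    set_analog_input(FAULT_THRESHOLD_V + 0.5)   # force the signal out of range
    start = time.monotonic()
    while time.monotonic() - start < timeout_s:
        if read_fault_flag():
            return time.monotonic() - start      # measured tolerance time
        time.sleep(0.001)
    return None                                  # fault never reported

# Hypothetical usage with simulator/TCU interface objects:
# elapsed = measure_tolerance_time(simulator.set_sensor_voltage, tcu.fault_reported)
# assert elapsed is not None and elapsed <= TOLERANCE_LIMIT_S
```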