• Title/Summary/Keyword: age cohort


Differentiating Uterine Sarcoma From Atypical Leiomyoma on Preoperative Magnetic Resonance Imaging Using Logistic Regression Classifier: Added Value of Diffusion-Weighted Imaging-Based Quantitative Parameters

  • Hokun Kim;Sung Eun Rha;Yu Ri Shin;Eu Hyun Kim;Soo Youn Park;Su-Lim Lee;Ahwon Lee;Mee-Ran Kim
    • Korean Journal of Radiology / v.25 no.1 / pp.43-54 / 2024
  • Objective: To evaluate the added value of diffusion-weighted imaging (DWI)-based quantitative parameters to distinguish uterine sarcomas from atypical leiomyomas on preoperative magnetic resonance imaging (MRI). Materials and Methods: A total of 138 patients (age, 43.7 ± 10.3 years) with uterine sarcoma (n = 44) and atypical leiomyoma (n = 94) were retrospectively included from four institutions. The cohort was randomly divided into training (84/138, 60.0%) and validation (54/138, 40.0%) sets. Two independent readers evaluated six qualitative MRI features and two DWI-based quantitative parameters for each index tumor. Multivariable logistic regression was used to identify the relevant qualitative MRI features. Diagnostic classifiers based on qualitative MRI features alone and in combination with DWI-based quantitative parameters were developed using a logistic regression algorithm. The diagnostic performance of the classifiers was evaluated using a cross-table analysis and calculation of the area under the receiver operating characteristic curve (AUC). Results: The mean apparent diffusion coefficient value of uterine sarcoma was lower than that of atypical leiomyoma (mean ± standard deviation, 0.94 ± 0.30 × 10-3 mm2/s vs. 1.23 ± 0.25 × 10-3 mm2/s; P < 0.001), and the relative contrast ratio was higher in uterine sarcoma (8.16 ± 2.94 vs. 4.19 ± 2.66; P < 0.001). Selected qualitative MRI features included ill-defined margin (adjusted odds ratio [aOR], 17.9; 95% confidence interval [CI], 1.41-503; P = 0.040), intratumoral hemorrhage (aOR, 27.3; 95% CI, 3.74-596; P = 0.006), and absence of T2 dark area (aOR, 83.5; 95% CI, 12.4-1916; P < 0.001). The classifier combining qualitative MRI features and DWI-based quantitative parameters showed significantly better performance than the classifier without DWI-based parameters in the validation set (AUC, 0.92 vs. 0.78; P < 0.001). Conclusion: The addition of DWI-based quantitative parameters to qualitative MRI features improved the diagnostic performance of the logistic regression classifier in differentiating uterine sarcomas from atypical leiomyomas on preoperative MRI.
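
The modeling step described above can be sketched in a few lines: a logistic regression classifier is fit with and without the DWI-based quantitative parameters and the two are compared by validation AUC. This is a minimal sketch on synthetic data; the feature names (ill-defined margin, intratumoral hemorrhage, absence of T2 dark area, mean ADC, relative contrast ratio) stand in for the authors' variables and are not their dataset or pipeline.

```python
# Illustrative sketch: logistic regression with and without DWI-based
# quantitative parameters, compared by validation AUC (synthetic data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 138
y = rng.binomial(1, 0.32, n)  # 1 = uterine sarcoma, 0 = atypical leiomyoma
# Three binary qualitative features (more frequent in sarcomas): ill-defined
# margin, intratumoral hemorrhage, absence of T2 dark area.
qualitative = rng.binomial(1, 0.2 + 0.5 * y[:, None], size=(n, 3)).astype(float)
mean_adc = rng.normal(1.23 - 0.29 * y, 0.28)     # lower ADC in sarcomas
contrast_ratio = rng.normal(4.2 + 4.0 * y, 2.8)  # higher contrast ratio in sarcomas

X_qual = qualitative
X_full = np.column_stack([qualitative, mean_adc, contrast_ratio])

Xq_tr, Xq_va, Xf_tr, Xf_va, y_tr, y_va = train_test_split(
    X_qual, X_full, y, train_size=0.6, random_state=42, stratify=y)

clf_qual = LogisticRegression(max_iter=1000).fit(Xq_tr, y_tr)
clf_full = LogisticRegression(max_iter=1000).fit(Xf_tr, y_tr)

print(f"Validation AUC, qualitative features only: "
      f"{roc_auc_score(y_va, clf_qual.predict_proba(Xq_va)[:, 1]):.2f}")
print(f"Validation AUC, plus DWI-based parameters: "
      f"{roc_auc_score(y_va, clf_full.predict_proba(Xf_va)[:, 1]):.2f}")
```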

Artificial Intelligence-Based Identification of Normal Chest Radiographs: A Simulation Study in a Multicenter Health Screening Cohort

  • Hyunsuk Yoo;Eun Young Kim;Hyungjin Kim;Ye Ra Choi;Moon Young Kim;Sung Ho Hwang;Young Joong Kim;Young Jun Cho;Kwang Nam Jin
    • Korean Journal of Radiology / v.23 no.10 / pp.1009-1018 / 2022
  • Objective: This study aimed to investigate the feasibility of using artificial intelligence (AI) to identify normal chest radiographs (CXR) in the worklist of radiologists in a health-screening environment. Materials and Methods: This retrospective simulation study was conducted using the CXRs of 5887 adults (mean age ± standard deviation, 55.4 ± 11.8 years; male, 4329) from three health screening centers in South Korea using a commercial AI (Lunit INSIGHT CXR3, version 3.5.8.8). Three board-certified thoracic radiologists reviewed CXR images for referable thoracic abnormalities and grouped the images into those with visible referable abnormalities (identified as abnormal by at least one reader) and those with clearly visible referable abnormalities (identified as abnormal by at least two readers). With AI-based simulated exclusion of normal CXR images, the percentage of normal images removed and the percentage of abnormal images erroneously removed were analyzed. Additionally, in a random subsample of 480 patients, the ability to identify visible referable abnormalities was compared among AI-unassisted reading (i.e., all images read by human readers without AI), AI-assisted reading (i.e., all images read by human readers with AI assistance as concurrent readers), and reading with AI triage (i.e., human reading of only those images rendered abnormal by AI). Results: Of 5887 CXR images, 405 (6.9%) and 227 (3.9%) contained visible and clearly visible abnormalities, respectively. With AI-based triage, 42.9% (2354/5482) of normal CXR images were removed at the cost of erroneous removal of 3.5% (14/405) and 1.8% (4/227) of CXR images with visible and clearly visible abnormalities, respectively. In the diagnostic performance study, AI triage removed 41.6% (188/452) of normal images from the worklist without missing visible abnormalities and increased the specificity for some readers without decreasing sensitivity. Conclusion: This study suggests the feasibility of sorting out and removing normal CXRs using AI with a tailored cut-off to increase efficiency and reduce the workload of radiologists.
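
The triage simulation summarized above reduces to a thresholding exercise: each study receives an AI abnormality score, studies below a tailored cutoff are removed from the worklist as normal, and the fractions of normal and abnormal studies removed are tallied. A minimal sketch with invented scores, prevalence, and cutoff; it does not reflect the commercial software's actual output.

```python
# Illustrative sketch of AI-based worklist triage with a tailored cutoff
# (synthetic abnormality scores; not the vendor's actual output).
import numpy as np

rng = np.random.default_rng(1)
n = 5887
is_abnormal = rng.random(n) < 0.069                 # ~6.9% visible abnormalities
# Hypothetical AI scores: abnormal studies tend to score higher.
ai_score = np.where(is_abnormal, rng.beta(5, 2, n), rng.beta(2, 5, n))

cutoff = 0.30                                       # tailored operating point (assumed)
removed = ai_score < cutoff                         # excluded from the reading worklist

normal_removed = removed[~is_abnormal].mean()
abnormal_removed = removed[is_abnormal].mean()
print(f"Normal studies removed: {normal_removed:.1%}")
print(f"Abnormal studies erroneously removed: {abnormal_removed:.1%}")
```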

Non-Contrast Cine Cardiac Magnetic Resonance Derived-Radiomics for the Prediction of Left Ventricular Adverse Remodeling in Patients With ST-Segment Elevation Myocardial Infarction

  • Xin A;Mingliang Liu;Tong Chen;Feng Chen;Geng Qian;Ying Zhang;Yundai Chen
    • Korean Journal of Radiology / v.24 no.9 / pp.827-837 / 2023
  • Objective: To investigate the predictive value of radiomics features based on cardiac magnetic resonance (CMR) cine images for left ventricular adverse remodeling (LVAR) after acute ST-segment elevation myocardial infarction (STEMI). Materials and Methods: We conducted a retrospective, single-center cohort study involving 244 patients with acute STEMI (88.5% male; age, 57.0 ± 10.3 years) who underwent CMR examination at one week and six months after percutaneous coronary intervention; patients were randomly split into training (n = 170) and testing (n = 74) sets. LVAR was defined as a 20% increase in left ventricular end-diastolic volume 6 months after acute STEMI. Radiomics features were extracted from the one-week CMR cine images and selected using least absolute shrinkage and selection operator (LASSO) regression analysis. The predictive performance of the selected features was evaluated using receiver operating characteristic curve analysis and the area under the curve (AUC). Results: Nine radiomics features with non-zero coefficients were retained by the LASSO regression and used to construct the radiomics score (RAD score). Infarct size (odds ratio [OR]: 1.04 (1.00-1.07); P = 0.031) and RAD score (OR: 3.43 (2.34-5.28); P < 0.001) were independent predictors of LVAR. The RAD score predicted LVAR, with an AUC (95% confidence interval [CI]) of 0.82 (0.75-0.89) in the training set and 0.75 (0.62-0.89) in the testing set. Combining the RAD score with infarct size yielded favorable performance in predicting LVAR, with an AUC of 0.84 (0.72-0.95). Moreover, the addition of the RAD score to the left ventricular ejection fraction (LVEF) significantly increased the AUC from 0.68 (0.52-0.84) to 0.82 (0.70-0.93) (P = 0.018), which was also comparable to the prediction provided by the combination of microvascular obstruction, infarct size, and LVEF, with an AUC of 0.79 (0.65-0.94) (P = 0.727). Conclusion: Radiomics analysis using non-contrast cine CMR can predict LVAR after STEMI independently and incrementally to LVEF and may provide an alternative to traditional CMR parameters.
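
The radiomics workflow in this abstract (LASSO-based feature selection, construction of a RAD score, and combination with infarct size) can be approximated as below. The data are synthetic, LassoCV is applied to the binary outcome as a simplification of a logistic LASSO, and names such as rad_score are illustrative rather than the authors' implementation.

```python
# Illustrative sketch: LASSO selection of radiomics features, a radiomics
# score (RAD score), and combination with infarct size (synthetic data).
import numpy as np
from sklearn.linear_model import LassoCV, LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n, p = 244, 100                          # patients, candidate radiomics features
X = rng.normal(size=(n, p))
infarct_size = rng.gamma(2.0, 10.0, n)
logit = 0.8 * X[:, 0] - 0.6 * X[:, 1] + 0.03 * infarct_size - 1.5
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # 1 = adverse remodeling

X_std = StandardScaler().fit_transform(X)
lasso = LassoCV(cv=5, random_state=0).fit(X_std, y)   # shrinks most coefficients to zero
selected = np.flatnonzero(lasso.coef_)
rad_score = X_std[:, selected] @ lasso.coef_[selected]  # weighted sum of retained features

combined = np.column_stack([rad_score, infarct_size])
clf = LogisticRegression(max_iter=1000).fit(combined, y)
print(f"{selected.size} features selected; apparent AUC: "
      f"{roc_auc_score(y, clf.predict_proba(combined)[:, 1]):.2f}")
```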

Prediction of Decompensation and Death in Advanced Chronic Liver Disease Using Deep Learning Analysis of Gadoxetic Acid-Enhanced MRI

  • Subin Heo;Seung Soo Lee;So Yeon Kim;Young-Suk Lim;Hyo Jung Park;Jee Seok Yoon;Heung-Il Suk;Yu Sub Sung;Bumwoo Park;Ji Sung Lee
    • Korean Journal of Radiology / v.23 no.12 / pp.1269-1280 / 2022
  • Objective: This study aimed to evaluate the usefulness of quantitative indices obtained from deep learning analysis of gadoxetic acid-enhanced hepatobiliary phase (HBP) MRI and their longitudinal changes in predicting decompensation and death in patients with advanced chronic liver disease (ACLD). Materials and Methods: We included patients who underwent baseline and 1-year follow-up MRI from a prospective cohort that underwent gadoxetic acid-enhanced MRI for hepatocellular carcinoma surveillance between November 2011 and August 2012 at a tertiary medical center. Baseline liver condition was categorized as non-ACLD, compensated ACLD, and decompensated ACLD. The liver-to-spleen signal intensity ratio (LS-SIR) and liver-to-spleen volume ratio (LS-VR) were automatically measured on the HBP images using a deep learning algorithm, and their percentage changes at the 1-year follow-up (ΔLS-SIR and ΔLS-VR) were calculated. The associations of the MRI indices with hepatic decompensation and a composite endpoint of liver-related death or transplantation were evaluated using a competing risk analysis with multivariable Fine and Gray regression models, including baseline parameters alone and both baseline and follow-up parameters. Results: Our study included 280 patients (153 male; mean age ± standard deviation, 57 ± 7.95 years): 32 with non-ACLD, 186 with compensated ACLD, and 62 with decompensated ACLD. Patients were followed for 11-117 months (median, 104 months). In patients with compensated ACLD, baseline LS-SIR (sub-distribution hazard ratio [sHR], 0.81; p = 0.034) and LS-VR (sHR, 0.71; p = 0.01) were independently associated with hepatic decompensation. The ΔLS-VR (sHR, 0.54; p = 0.002) was predictive of hepatic decompensation after adjusting for baseline variables. ΔLS-VR was an independent predictor of liver-related death or transplantation in patients with compensated ACLD (sHR, 0.46; p = 0.026) and decompensated ACLD (sHR, 0.61; p = 0.023). Conclusion: MRI indices automatically derived from the deep learning analysis of gadoxetic acid-enhanced HBP MRI can be used as prognostic markers in patients with ACLD.
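
A rough sketch of the longitudinal index analysis: the percentage change of a deep learning-derived index (here ΔLS-VR) is computed between baseline and follow-up MRI and related to outcome. lifelines does not implement the Fine and Gray model, so a cause-specific Cox regression stands in for the competing-risk analysis; the data and column names are hypothetical.

```python
# Illustrative sketch: percentage change of a deep learning-derived MRI index
# and its association with decompensation. A cause-specific Cox model is used
# as a stand-in for the Fine and Gray competing-risk regression (synthetic data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 280
ls_vr_baseline = rng.normal(3.0, 0.8, n)
ls_vr_followup = ls_vr_baseline * rng.normal(0.97, 0.10, n)
delta_ls_vr = 100 * (ls_vr_followup - ls_vr_baseline) / ls_vr_baseline  # % change at 1 year

df = pd.DataFrame({
    "delta_ls_vr": delta_ls_vr,
    "time_months": rng.exponential(90, n).clip(1, 117),
    "decompensation": rng.binomial(1, 0.25, n),  # 1 = event, 0 = censored (simplified)
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_months", event_col="decompensation")
cph.print_summary()
```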

Outcomes of Completion Lobectomy for Locoregional Recurrence after Sublobar Resection in Patients with Non-small Cell Lung Cancer

  • Cho Eun Lee;Jeonghee Yun;Yeong Jeong Jeon;Junghee Lee;Seong Yong Park;Jong Ho Cho;Hong Kwan Kim;Yong Soo Choi;Jhingook Kim;Young Mog Shim
    • Journal of Chest Surgery / v.57 no.2 / pp.128-135 / 2024
  • Background: This retrospective study aimed to determine the treatment patterns and the surgical and oncologic outcomes after completion lobectomy (CL) in patients with locoregionally recurrent stage I non-small cell lung cancer (NSCLC) who previously underwent sublobar resection. Methods: Data from 36 patients who initially underwent sublobar resection for clinical and pathological stage IA NSCLC and experienced locoregional recurrence between 2008 and 2016 were analyzed. Results: Thirty-six (3.6%) of 1,003 patients who underwent sublobar resection for NSCLC experienced locoregional recurrence. The patients' median age was 66.5 (range, 44-77) years at the initial operation, and 28 (77.8%) patients were men. Six (16.7%) patients underwent segmentectomy and 30 (83.3%) underwent wedge resection as the initial operation. The median follow-up from the initial operation was 56 (range, 9-150) months. Ten (27.8%) patients underwent CL, 22 (61.1%) underwent non-surgical treatments (chemotherapy, radiation, or concurrent chemoradiation therapy), and 4 (11.1%) did not receive treatment or were lost to follow-up after recurrence. Patients who underwent CL experienced no significant complications or deaths. The median follow-up time after CL was 64.5 (range, 19-93) months. The 5-year overall survival (OS) and post-recurrence survival (PRS) rates were higher in the surgical group than in the non-surgical (p < 0.001) and no-treatment groups (p < 0.001). Conclusion: CL is a technically demanding but safe procedure for locoregionally recurrent stage I NSCLC after sublobar resection. Patients who underwent CL had better OS and PRS than patients who underwent non-surgical treatments or no treatment; however, a larger cohort study and long-term surveillance are necessary.
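
For the survival comparisons behind the reported p-values, a sketch of Kaplan-Meier estimation with a log-rank test is shown below. Group sizes mirror the abstract, but the event times and censoring are simulated, not the study data.

```python
# Illustrative sketch: Kaplan-Meier post-recurrence survival for surgical
# vs. non-surgical management, compared with a log-rank test (synthetic data).
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(4)
t_surg = rng.exponential(80, 10)      # months, completion lobectomy group
t_nonsurg = rng.exponential(30, 22)   # months, non-surgical group
e_surg = rng.binomial(1, 0.4, 10)     # 1 = death observed, 0 = censored
e_nonsurg = rng.binomial(1, 0.8, 22)

km = KaplanMeierFitter()
km.fit(t_surg, e_surg, label="completion lobectomy")
print(km.survival_function_.tail())

result = logrank_test(t_surg, t_nonsurg,
                      event_observed_A=e_surg, event_observed_B=e_nonsurg)
print(f"log-rank p = {result.p_value:.3f}")
```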

Chest wall injury fracture patterns are associated with different mechanisms of injury: a retrospective review study in the United States

  • Jennifer M. Brewer;Owen P. Karsmarski;Jeremy Fridling;T. Russell Hill;Chasen J. Greig;Sarah E. Posillico;Carol McGuiness;Erin McLaughlin;Stephanie C. Montgomery;Manuel Moutinho;Ronald Gross;Evert A. Eriksson;Andrew R. Doben
    • Journal of Trauma and Injury / v.37 no.1 / pp.48-59 / 2024
  • Purpose: Research on rib fracture management has increased exponentially. Predicting fracture patterns based on the mechanism of injury (MOI) and other possible correlations may improve resource allocation and injury prevention strategies. The Chest Injury International Database (CIID) is the largest prospective repository of the operative and nonoperative management of patients with severe chest wall trauma. The purpose of this study was to determine whether the MOI is associated with the resulting rib fracture patterns. We hypothesized that specific MOIs would be associated with distinct rib fracture patterns. Methods: The CIID was queried to analyze fracture patterns based on the MOI. Patients were stratified by MOI: falls, motor vehicle collisions (MVCs), motorcycle collisions (MCCs), automobile-pedestrian collisions, and bicycle collisions. Fracture locations, associated injuries, and patient-specific variables were recorded. Heat maps were created to display the fracture incidence by rib location. Results: The study cohort consisted of 1,121 patients with a median RibScore of 2 (range, 0-3) and 9,353 fractures. The average age was 57 ± 20 years, and 64% of patients were male. By MOI, the number of patients and fractures were as follows: falls (474 patients, 3,360 fractures), MVCs (353 patients, 3,268 fractures), MCCs (165 patients, 1,505 fractures), automobile-pedestrian collisions (70 patients, 713 fractures), and bicycle collisions (59 patients, 507 fractures). The most commonly injured rib was the sixth rib, and the most common fracture location was lateral. Statistically significant differences in fracture location and pattern were identified when comparing the MOIs, except for MCCs versus bicycle collisions. Conclusions: Different mechanisms of injury result in distinct rib fracture patterns. These different patterns should be considered in the workup and management of patients with thoracic injuries. Given these significant differences, future studies should account for both fracture location and the MOI to better define which populations benefit from surgical versus nonoperative management.
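
The pairwise comparison of fracture-location distributions across mechanisms of injury can be illustrated with a chi-square test on a contingency table of fracture counts; the counts below are fabricated for demonstration and are not drawn from the CIID.

```python
# Illustrative sketch: compare fracture-location distributions between two
# mechanisms of injury with a chi-square test (counts are made up).
import numpy as np
from scipy.stats import chi2_contingency

# Rows: MOI; columns: fracture location (anterior, lateral, posterior).
counts = np.array([
    [820, 1610, 930],    # falls
    [1010, 1390, 868],   # motor vehicle collisions
])
chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}")
```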

Liver-to-Spleen Volume Ratio Automatically Measured on CT Predicts Decompensation in Patients with B Viral Compensated Cirrhosis

  • Ji Hye Kwon;Seung Soo Lee;Jee Seok Yoon;Heung-Il Suk;Yu Sub Sung;Ho Sung Kim;Chul-min Lee;Kang Mo Kim;So Jung Lee;So Yeon Kim
    • Korean Journal of Radiology / v.22 no.12 / pp.1985-1995 / 2021
  • Objective: Although the liver-to-spleen volume ratio (LSVR) based on CT reflects portal hypertension, its prognostic role in cirrhotic patients has not been proven. We evaluated the utility of LSVR, automatically measured from CT images using a deep learning algorithm, as a predictor of hepatic decompensation and transplantation-free survival in patients with hepatitis B viral (HBV)-compensated cirrhosis. Materials and Methods: A deep learning algorithm was used to measure the LSVR in a cohort of 1027 consecutive patients (mean age, 50.5 years; 675 male and 352 female) with HBV-compensated cirrhosis who underwent liver CT (2007-2010). Associations of LSVR with hepatic decompensation and transplantation-free survival were evaluated using multivariable Cox proportional hazards and competing risk analyses, accounting for either the Child-Pugh score (CPS) or Model for End-Stage Liver Disease (MELD) score and other variables. The risk of liver-related events was estimated using Kaplan-Meier analysis and the Aalen-Johansen estimator. Results: After adjustment for either CPS or MELD and other variables, LSVR was identified as a significant independent predictor of hepatic decompensation (hazard ratio per 1-unit increase in LSVR, 0.71 and 0.68 for the CPS and MELD models, respectively; p < 0.001) and transplantation-free survival (hazard ratio per 1-unit increase in LSVR, 0.80 and 0.77, respectively; p < 0.001). Patients with an LSVR of < 2.9 (n = 381) had significantly higher 3-year risks of hepatic decompensation (16.7% vs. 2.5%, p < 0.001) and liver-related death or transplantation (10.0% vs. 1.1%, p < 0.001) than those with an LSVR of ≥ 2.9 (n = 646). When patients were stratified according to CPS (Child-Pugh A vs. B-C) and MELD (< 10 vs. ≥ 10), an LSVR of < 2.9 was still associated with a higher risk of liver-related events than an LSVR of ≥ 2.9 for all Child-Pugh (p ≤ 0.045) and MELD (p ≤ 0.009) stratifications. Conclusion: The LSVR measured on CT can predict hepatic decompensation and transplantation-free survival in patients with HBV-compensated cirrhosis.
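
The risk stratification reported above (LSVR dichotomized at 2.9 and 3-year cumulative incidence of decompensation, with death or transplantation as a competing event) can be sketched with the Aalen-Johansen estimator in lifelines. The LSVR values, event times, and event codes below are simulated; the 2.9 cutoff is the only detail taken from the abstract.

```python
# Illustrative sketch: dichotomize LSVR at 2.9 and estimate the cumulative
# incidence of decompensation with the Aalen-Johansen estimator, treating
# death/transplantation as a competing event (synthetic data).
import numpy as np
from lifelines import AalenJohansenFitter

rng = np.random.default_rng(5)
n = 1027
lsvr = rng.normal(3.2, 0.9, n)
low_lsvr = lsvr < 2.9

time_years = rng.exponential(8, n).clip(0.1, 10)
# 0 = censored, 1 = decompensation, 2 = competing event (death/transplant)
event = rng.choice([0, 1, 2], size=n, p=[0.75, 0.17, 0.08])
# Make decompensation more frequent in the low-LSVR group, as reported above.
event[low_lsvr & (rng.random(n) < 0.10)] = 1

ajf = AalenJohansenFitter()
ajf.fit(time_years[low_lsvr], event[low_lsvr], event_of_interest=1)
risk_3y = float(ajf.cumulative_density_.loc[:3.0].iloc[-1, 0])
print(f"3-year cumulative incidence of decompensation (LSVR < 2.9): {risk_3y:.1%}")
```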

Diffusion-Weighted Imaging for Differentiation of Biliary Atresia and Grading of Hepatic Fibrosis in Infants with Cholestasis

  • Jisoo Kim;Hyun Joo Shin;Haesung Yoon;Seok Joo Han;Hong Koh;Myung-Joon Kim;Mi-Jung Lee
    • Korean Journal of Radiology / v.22 no.2 / pp.253-262 / 2021
  • Objective: To determine whether hepatic apparent diffusion coefficient (ADC) values can differentiate biliary atresia (BA) from non-BA or correlate with the grade of hepatic fibrosis in infants with cholestasis. Materials and Methods: This retrospective cohort study included infants who underwent liver MRI examinations to evaluate cholestasis from July 2009 to October 2017. Liver ADC, ADC ratio of liver/spleen, aspartate aminotransferase to platelet ratio index (APRI), and spleen size were compared between the BA and non-BA groups. The diagnostic performance of all parameters for significant fibrosis (F3-4) was obtained by receiver operating characteristic (ROC) curve analysis. Results: Altogether, 227 infants (98 males and 129 females; mean age, 57.2 ± 36.3 days), including 125 patients with BA, were analyzed. The absolute ADC difference between the two reviewers was 0.10 mm2/s for both liver and spleen. The liver ADC value was specific (80.4%) and the ADC ratio was sensitive (88.0%) for the diagnosis of BA, with comparable performance. There were 33 patients with F0, 15 with F1, 71 with F2, 35 with F3, and 11 with F4. All four parameters, namely APRI (τ = 0.296), spleen size (τ = 0.312), liver ADC (τ = -0.206), and ADC ratio (τ = -0.288), showed a significant correlation with fibrosis grade (all p < 0.001). The cutoff values for significant fibrosis (F3-4) were 0.783 for APRI (area under the ROC curve [AUC], 0.721), 5.9 cm for spleen size (AUC, 0.719), 1.044 × 10-3 mm2/s for liver ADC (AUC, 0.673), and 1.22 for ADC ratio (AUC, 0.651). Conclusion: Liver ADC values and the ADC ratio of liver/spleen showed limited additional diagnostic performance for differentiating BA from non-BA and predicting significant hepatic fibrosis in infants with cholestasis.
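
A compact sketch of the two analyses in this abstract: Kendall's tau between an imaging parameter and fibrosis grade, and an ROC-derived cutoff (Youden index) for significant fibrosis (F3-4). The values are synthetic, and the assumed direction of the effect (lower ADC ratio with more fibrosis) follows the reported negative correlation.

```python
# Illustrative sketch: Kendall's tau between an imaging parameter and fibrosis
# grade, plus an ROC-derived cutoff for significant fibrosis (synthetic data).
import numpy as np
from scipy.stats import kendalltau
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(6)
n = 165
fibrosis_grade = rng.integers(0, 5, n)                      # F0-F4 (synthetic)
adc_ratio = rng.normal(1.35 - 0.05 * fibrosis_grade, 0.15)  # lower ratio with more fibrosis

tau, p = kendalltau(adc_ratio, fibrosis_grade)
print(f"Kendall tau = {tau:.3f}, p = {p:.3g}")

significant = (fibrosis_grade >= 3).astype(int)
# Lower ADC ratio indicates more fibrosis, so use the negated value as the score.
fpr, tpr, thresholds = roc_curve(significant, -adc_ratio)
cutoff = -thresholds[np.argmax(tpr - fpr)]                  # Youden index
auc = roc_auc_score(significant, -adc_ratio)
print(f"AUC = {auc:.2f}, optimal ADC-ratio cutoff = {cutoff:.2f}")
```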

Imaging Predictors of Survival in Patients with Single Small Hepatocellular Carcinoma Treated with Transarterial Chemoembolization

  • Chan Park;Jin Hyoung Kim;Pyeong Hwa Kim;So Yeon Kim;Dong Il Gwon;Hee Ho Chu;Minho Park;Joonho Hur;Jin Young Kim;Dong Joon Kim
    • Korean Journal of Radiology / v.22 no.2 / pp.213-224 / 2021
  • Objective: Clinical outcomes of patients who undergo transarterial chemoembolization (TACE) for single small hepatocellular carcinoma (HCC) are not consistent and may differ depending on certain imaging findings. This retrospective study aimed to determine the value of pre-TACE CT or MR imaging findings for predicting survival outcomes in patients with single small HCC treated with TACE. In addition, the study aimed to build a risk prediction model for these patients. Materials and Methods: Altogether, 750 patients with good hepatic functional reserve who received TACE as the first-line treatment for single small HCC between 2004 and 2014 were included in the study. These patients were randomly assigned to training (n = 525) and validation (n = 225) sets. Results: According to the results of a multivariable Cox analysis, three pre-TACE imaging findings (tumor margin, tumor location, enhancement pattern) and two clinical factors (age, serum albumin level) were selected and scored to create predictive models for overall, local tumor progression (LTP)-free, and progression-free survival in the training set. The median overall survival times in the validation set were 137.5, 76.1, and 44.0 months for the low-, intermediate-, and high-risk groups, respectively (p < 0.001). Time-dependent receiver operating characteristic curves of the predictive models for overall, LTP-free, and progression-free survival applied to the validation cohort showed acceptable area under the curve values (0.734, 0.802, and 0.775 for overall survival; 0.738, 0.789, and 0.791 for LTP-free survival; and 0.671, 0.733, and 0.694 for progression-free survival at 3, 5, and 10 years, respectively). Conclusion: Pre-TACE CT or MR imaging findings could predict survival outcomes in patients with single small HCC treated with TACE. Our predictive models, which include three imaging predictors, could be helpful for prognostication and for identifying and selecting suitable candidates for TACE among patients with single small HCC.
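
The scored prediction model described above can be illustrated with a simple additive point score over imaging and clinical predictors, followed by stratification into low-, intermediate-, and high-risk groups. The predictor names, weights, and cut points below are hypothetical and are not the published model.

```python
# Illustrative sketch: an additive point score from imaging and clinical
# predictors, stratifying patients into risk groups (weights are hypothetical).
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
n = 225
df = pd.DataFrame({
    "non_smooth_margin": rng.binomial(1, 0.3, n),
    "subcapsular_location": rng.binomial(1, 0.25, n),
    "heterogeneous_enhancement": rng.binomial(1, 0.35, n),
    "age_ge_65": rng.binomial(1, 0.4, n),
    "low_albumin": rng.binomial(1, 0.3, n),
})

points = {"non_smooth_margin": 2, "subcapsular_location": 1,
          "heterogeneous_enhancement": 2, "age_ge_65": 1, "low_albumin": 1}
df["score"] = sum(df[k] * w for k, w in points.items())
df["risk_group"] = pd.cut(df["score"], bins=[-1, 1, 3, 7],
                          labels=["low", "intermediate", "high"])
print(df["risk_group"].value_counts())
```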

Importance of an Integrated Assessment of Functional Disability and Work Ability in Workers Affected by Low Back Pain

  • Fabrizio Russo;Cristina Di Tecco;Simone Russo;Giorgia Petrucci;Gianluca Vadala;Vincenzo Denaro;Sergio Iavicoli
    • Safety and Health at Work / v.15 no.1 / pp.66-72 / 2024
  • Background: This study examines the relationship between functional disability and work ability in workers affected by low back pain (LBP) through an analysis of correlations between the Oswestry Disability Index (ODI) and the Work Ability Index (WAI). The role of personal and work factors in functional disability and work ability levels was also studied. LBP is the most common musculoskeletal problem and a major disabling health problem worldwide. Its etiology is multifactorial. Multidisciplinary approaches may help reduce the burden of pain and disability and improve job continuity and reintegration at work. Methods: A cohort of 264 patients with LBP from an Italian outpatient clinic was included in a clinical diagnostic/therapeutic trial aimed at rehabilitation and return to work through an integrated investigation protocol. Data were collected during the first medical examination using anamnestic and clinical tools. The final sample comprised 252 patients: 57.1% men, 44.0% blue-collar workers, 46.4% with a high school degree, and 45.6% married. Results: The WAI and ODI showed a fair negative correlation (r = -0.454; p < 0.001). Workers with acute LBP symptoms had a higher probability of severe disability than those with chronic LBP symptoms. White-collar workers without depressive symptoms reported higher work ability, even in chronic disability conditions, than those with depressive symptoms. Conclusion: The study found that the ODI and WAI have convergent validity, suggesting that the two tools capture distinct aspects of disability related to personal, environmental, and occupational characteristics. The most important and modifiable prognostic factors for ODI and WAI were depressive symptoms, workday absence, and intensity of back pain. The study also found a mild association between age and ODI. These findings highlight the importance of a multidisciplinary approach to managing and preventing disability due to LBP.
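
The ODI-WAI relationship reported above can be reproduced in form (not in value) with a Pearson correlation on paired scores; the data below are simulated.

```python
# Illustrative sketch: correlation between ODI and WAI scores (synthetic values).
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(8)
n = 252
odi = rng.uniform(0, 60, n)                 # Oswestry Disability Index (%)
wai = 44 - 0.2 * odi + rng.normal(0, 5, n)  # higher disability, lower work ability
r, p = pearsonr(odi, wai)
print(f"r = {r:.3f}, p = {p:.3g}")
```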