• Title/Summary/Keyword: risk categories

470 search results

Classification of Ground Subsidence Factors for Prediction of Ground Subsidence Risk (GSR) (굴착공사 중 지반함몰 위험예측을 위한 지반함몰인자 분류)

  • Park, Jin Young; Jang, Eugene; Kim, Hak Joon; Ihm, Myeong Hyeok
    • The Journal of Engineering Geology, v.27 no.2, pp.153-164, 2017
  • The geological factors causing ground subsidence are very diverse. Subsidence can be affected by geological or extrinsic influences, and even within the same geological setting, the governing factor can be determined by different physical properties. A review of a large number of papers and case histories shows that ground subsidence factors fall into seven categories. The depth and thickness of the overburden can affect subsidence depending on the existence of a cavity, whereas the depth and orientation of the soil-rock boundary are dominant factors in ground composed of both soil and rock. In soil layers, more varied influencing factors exist, such as soil type, shear strength, relative density and degree of compaction, dry unit weight, water content, and liquid limit. The rock type, distance from the main fracture, and RQD can be influential factors in bedrock. From a hydrogeological point of view, rainfall intensity, the distance and depth from the main channel, the coefficient of permeability, and fluctuation of the groundwater level can influence ground subsidence. Ground subsidence can also be affected by external factors such as the depth of excavation and distance from the earth retaining wall, groundwater treatment methods during excavation work, and the existence of artificial facilities such as sewer pipes. Evaluating these ground subsidence factors during the construction of underground structures in urban areas is expected to be essential, and the factors examined in this study are expected to contribute to a reliable evaluation of ground subsidence risk.
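
As a quick illustration of the classification above, the following minimal Python sketch (not from the paper) groups the factor categories named in the abstract into a simple data structure; all category and factor labels are paraphrases, not the paper's actual GSR scheme.

```python
# Illustrative grouping of the ground subsidence factors named in the abstract.
# Category and factor labels are paraphrased, not the paper's actual GSR scheme.
GROUND_SUBSIDENCE_FACTORS = {
    "overburden": ["depth", "thickness", "cavity presence"],
    "soil_rock_boundary": ["depth", "orientation"],
    "soil_layer": ["soil type", "shear strength", "relative density",
                   "degree of compaction", "dry unit weight",
                   "water content", "liquid limit"],
    "bedrock": ["rock type", "distance from main fracture", "RQD"],
    "hydrogeology": ["rainfall intensity", "distance from main channel",
                     "depth from main channel", "coefficient of permeability",
                     "groundwater level fluctuation"],
    "external": ["excavation depth", "distance from retaining wall",
                 "groundwater treatment method", "nearby artificial facilities"],
}

# Example: list every factor to be checked during an urban excavation survey.
for category, factors in GROUND_SUBSIDENCE_FACTORS.items():
    print(category, "->", ", ".join(factors))
```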

The Incidence and Risk Factors of Metabolic Syndrome in Rural Area (농촌지역 주민의 대사증후군 발생률과 위험요인)

  • Yoon, Hee-Jung; Lee, Sung-Kook
    • Journal of the Korea Academia-Industrial cooperation Society, v.16 no.6, pp.3934-3943, 2015
  • This study was conducted to investigate the incidence rate of metabolic syndrome and its related factors. A total of 620 persons who had participated in both the initial survey (2006) and the second survey (2010) were selected, and among them 460 persons who did not initially have metabolic syndrome were chosen as the final study subjects. They were classified into two categories: a stationary normal group (352, 76.5%) and a metabolic syndrome incidence group (108, 23.5%). The incidence rate of metabolic syndrome was 25.2 per 1,000 person-years. Sex, obesity, and smoking had significant effects on the incidence of metabolic syndrome. In multiple logistic regression analysis, after controlling for other variables, the obesity index was found to be the major factor in the incidence of metabolic syndrome; the risk was increased by overweight or obesity. A strategy to control body weight should be emphasized for the prevention of metabolic syndrome.
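
As a quick reference for the rate reported above, here is a minimal sketch of the standard incidence-rate calculation; the helper name and the example person-time figure are hypothetical, not taken from the paper.

```python
def incidence_rate_per_1000(new_cases: int, person_years: float) -> float:
    """Incidence rate = new cases / person-time at risk, scaled to 1,000 person-years."""
    return new_cases / person_years * 1000

# Hypothetical example: 108 incident cases over roughly 4,286 person-years of
# follow-up would give about the 25.2 per 1,000 person-years reported above.
print(round(incidence_rate_per_1000(108, 4286), 1))  # ~25.2
```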

Comparative Study of Korean Workers' Exposure to Dichloromethane by Process Category between Work Environment Monitoring Program and ECETOC TRA (국내 디클로로메탄 제조·사용 사업장 근로자의 공정별 노출수준에 대한 작업환경측정값과 ECETOC TRA 모델값 비교연구)

  • Jeong, Sujin; Bae, Gyewan; Lee, Naroo
    • Journal of Korean Society of Occupational and Environmental Hygiene, v.31 no.4, pp.317-330, 2021
  • Objectives: By law, companies in Korea must periodically measure workers' exposure to harmful chemicals (a system called the Work Environment Monitoring Program (WMP)[a]) and report the results to the government. The government also measures exposure to monitor the WMP's reliability (called Reliability Assessment (RA) for WMP[b]). The issue is that the measured data from these two sources differ so much that the objectivity of the WMP needs to be confirmed by comparing both sets of results against the European Centre for Ecotoxicology and Toxicology of Chemicals' Targeted Risk Assessment (ECETOC TRA). Methods: Step 1: Collection of data from WMP reports submitted by companies (n=586) and RA for WMP reports written by the government (n=33). Step 2: Standardization of the data by the key information included. Step 3: Conversion of the data into the input variables required to run the ECETOC TRA model, and running the model on the data (n=514) that met the predetermined exposure scenario. Step 4: Statistical analysis by process category (PROC) and ventilation type from each source ([A] and [B]). Step 5: Additional analysis of any unexpected results. Results: The production and handling of dichloromethane were classified into 12 PROCs, ten of which were selected to run ECETOC TRA. Modeled values tended to be higher than measured values from both sources. For the measured values from the WMP, the RCR distribution by PROC was narrow (0.197-0.267, 95% CI) and showed no relationship with ventilation type, which differs from the tendency of the modeling results. Meanwhile, the measured values from the RA for WMP were relatively widely distributed (0.301-1.177, 95% CI) by PROC; in particular, PROCs 13 and 19 were high enough to exceed 1. They also decreased with better ventilation types, varying with ventilation in a manner similar to the modeled results. Conclusions: This study revealed that ECETOC TRA may have the potential to serve as a screening tool for exposure assessment and as an assistive method for the WMP to estimate exposure. Further empirical study is required to confirm its suitability as a screening tool.
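
For readers unfamiliar with the RCR values discussed above, here is a minimal sketch of the risk characterization ratio under the usual REACH convention (RCR = estimated or measured exposure divided by the DNEL, with values above 1 flagging inadequately controlled risk); the DNEL and exposure numbers below are placeholders, not figures from the study.

```python
def risk_characterization_ratio(exposure_mg_m3: float, dnel_mg_m3: float) -> float:
    """RCR = exposure / DNEL; values above 1 indicate inadequately controlled risk
    under the usual REACH convention."""
    return exposure_mg_m3 / dnel_mg_m3

# Placeholder numbers for illustration only (not from the study):
dnel = 350.0  # hypothetical worker inhalation DNEL, mg/m3
for proc, exposure in {"PROC 7": 60.0, "PROC 13": 420.0}.items():
    rcr = risk_characterization_ratio(exposure, dnel)
    print(proc, round(rcr, 2), "exceeds 1" if rcr > 1 else "below 1")
```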

Factors associated hospital admission in patients with low acuity visiting emergency department (응급실 방문환자 중 낮은 우선순위를 가진 환자의 입원에 영향을 주는 요소)

  • Oh, Min Taek; Lee, Seong Hwa; Park, Seong Wook; Park, Soon Chang; Kim, Hyung Bin; Jo, Young Mo; Bae, Byung Gwan; Wang, Il Jae
    • Journal of The Korean Society of Emergency Medicine, v.29 no.5, pp.408-414, 2018
  • Objective: Patients with low acuity who need hospitalization may be at risk if they do not receive proper treatment in overcrowded emergency rooms. This study was conducted to investigate factors affecting the hospitalization of patients with low acuity on the Korean Triage and Acuity Scale (KTAS). Methods: This study was a retrospective chart review of patients aged 15 years or older who had been triaged as KTAS grade 4 or 5 when visiting a local emergency medical center from January 1, 2016 to December 31, 2017. Multivariate logistic analysis was performed to analyze the effects of age, sex, reason for visiting, visiting route, ambulance utilization, KTAS grade, and major category on patient admission. Results: A total of 10,540 patients were enrolled, and the odds ratio (OR) increased with age for those aged over 34 years (P<0.001). Patients triaged as KTAS grade 5 (adjusted OR, 1.57; 95% confidence interval [CI], 1.36-1.82), those with a condition caused by disease (adjusted OR, 2.31; 95% CI, 2.00-2.68), and those who visited by ambulance (public: adjusted OR, 1.05; 95% CI, 0.91-1.22; private: adjusted OR, 4.60; 95% CI, 3.85-5.49) were more likely to be hospitalized. Individuals in the "general" major category were more likely to be hospitalized than those in other major categories (P<0.001). Conclusion: The factors influencing the hospitalization of patients with low acuity were age, reason for visiting, visiting route, ambulance utilization, KTAS grade, and major category.
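
As a minimal sketch of the kind of multivariable logistic regression described in the Methods above, the snippet below fits a model with statsmodels and reports adjusted odds ratios; the column names mirror the predictors listed in the abstract, but the synthetic data and variable coding are assumptions for illustration, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data, purely to show the modelling step.
rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "admitted": rng.integers(0, 2, n),
    "age": rng.integers(15, 90, n),
    "female": rng.integers(0, 2, n),
    "ktas5": rng.integers(0, 2, n),            # 1 = KTAS grade 5, 0 = grade 4
    "disease": rng.integers(0, 2, n),          # reason for visit caused by disease
    "private_ambulance": rng.integers(0, 2, n),
})

model = smf.logit("admitted ~ age + female + ktas5 + disease + private_ambulance",
                  data=df).fit(disp=0)
print(np.exp(model.params))      # adjusted odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals
```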

Association between a Genetic Variant of CACNA1C and the Risk of Schizophrenia and Bipolar I Disorder Across Diagnostic Boundaries (조현병과 제1형 양극성장애의 진단 경계를 넘어선 공통적 후보유전자로서의 CACNA1C에 대한 단일염기다형성 연합 연구)

  • Lee, Bora; Baek, Ji Hyun; Cho, Eun Young; Yang, So-Yung; Choi, Yoo Jin; Lee, Yu-Sang; Ha, Kyooseob; Hong, Kyung Sue
    • Korean Journal of Schizophrenia Research, v.21 no.2, pp.43-50, 2018
  • Objectives: Genome-wide association studies (GWASs) and meta-analyses indicate that single-nucleotide polymorphisms (SNPs) in the gene encoding the α-1C subunit of the L-type voltage-dependent calcium channel (CACNA1C) increase the risk for schizophrenia and bipolar disorders (BDs). We investigated the association between genetic variants of CACNA1C and schizophrenia and/or BDs in the Korean population. Methods: A total of 582 patients with schizophrenia, 336 patients with BDs (179 with bipolar I disorder [BD-I] and 157 with bipolar II disorder [BD-II]), and 502 healthy controls were recruited. Based on previous results from other populations, three SNPs (rs10848635, rs1006737, and rs4765905) were selected, and genotype-wise association was evaluated using logistic regression analysis under additive, dominant, and recessive genetic models. Results: rs10848635 showed a significant association with schizophrenia (p=0.010), the combined schizophrenia and BD group (p=0.018), and the combined schizophrenia and BD-I group (p=0.011). The best-fit model was the dominant model for all of these phenotypes. The association remained significant after correction for multiple testing in schizophrenia and in the combined schizophrenia and BD-I group. Conclusion: We identified a possible role of CACNA1C in the common susceptibility to schizophrenia and BD-I; however, no association trend was observed for BD-II. Further efforts are needed to identify a specific phenotype associated with this gene across the current diagnostic categories.
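
A minimal sketch of how minor-allele genotype counts are commonly recoded under the additive, dominant, and recessive models mentioned in the Methods above; the function name is illustrative and the coding convention is the standard one, not necessarily the paper's exact implementation.

```python
def encode_genotype(minor_allele_count: int, model: str) -> int:
    """Recode a genotype (0, 1, or 2 copies of the minor allele) for regression.

    additive  : 0 / 1 / 2 copies enter the model as-is
    dominant  : carriers of at least one minor allele vs. non-carriers
    recessive : minor-allele homozygotes vs. everyone else
    """
    if model == "additive":
        return minor_allele_count
    if model == "dominant":
        return 1 if minor_allele_count >= 1 else 0
    if model == "recessive":
        return 1 if minor_allele_count == 2 else 0
    raise ValueError(f"unknown genetic model: {model}")

# Example: a heterozygote (one minor allele) under each model
print([encode_genotype(1, m) for m in ("additive", "dominant", "recessive")])  # [1, 1, 0]
```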

International Case Study and Strategy Proposal for IUCN Red List of Ecosystem(RLE) Assessment in South Korea (국내 IUCN Red List of Ecosystem(생태계 적색목록) 평가를 위한 국제 사례 연구와 전략 제시)

  • Sang-Hak Han; Sung-Ryong Kang
    • Journal of Wetlands Research, v.25 no.4, pp.408-416, 2023
  • The IUCN Red List of Ecosystems serves as a global standard for assessing and identifying ecosystems at high risk of biodiversity loss, providing the scientific evidence necessary for effective ecosystem management and conservation policy formulation. The IUCN Red List of Ecosystems has been designated as a key indicator (A.1) for Goal A of the Kunming-Montreal Global Biodiversity Framework. The assessment discerns signs of ecosystem collapse through specific criteria: reduction in distribution (Criterion A), restricted distribution (Criterion B), environmental degradation (Criterion C), changes in biological interaction (Criterion D), and quantitative estimation of the risk of ecosystem collapse (Criterion E). Since 2014, the IUCN Red List of Ecosystems has been evaluated in over 110 countries, with more than 80% of the assessments conducted in terrestrial and inland water ecosystems, among which tropical and subtropical forests are prominent among the ecosystems under threat. The assessment criteria are concentrated on spatial signs (Criteria A and B), accounting for 68.8%. There are three main considerations for applying the Red List of Ecosystems assessment domestically. First, it is necessary to compile the applicable terrestrial ecosystem types within the country. Second, it must be determined whether the spatial-sign assessments among the Red List of Ecosystems criteria can be applied to the various small-scale ecosystems found domestically. Lastly, the collection of usable time series data (50 years) for the assessment must be considered. Based on these considerations, applying the IUCN Red List of Ecosystems assessment domestically would enable an accurate understanding of the current state of the country's unique ecosystem types, contributing to global efforts in ecosystem conservation and restoration.

Validation of CT-Based Risk Stratification System for Lymph Node Metastasis in Patients With Thyroid Cancer

  • Yun Hwa Roh; Sae Rom Chung; Jung Hwan Baek; Young Jun Choi; Tae-Yon Sung; Dong Eun Song; Tae Yong Kim; Jeong Hyun Lee
    • Korean Journal of Radiology, v.24 no.10, pp.1028-1037, 2023
  • Objective: To evaluate the computed tomography (CT) features for diagnosing metastatic cervical lymph nodes (LNs) in patients with differentiated thyroid cancer (DTC) and validate the CT-based risk stratification system suggested by the Korean Thyroid Imaging Reporting and Data System (K-TIRADS) guidelines. Materials and Methods: A total of 463 LNs from 399 patients with DTC who underwent preoperative CT staging and ultrasound-guided fine-needle aspiration were included. The following CT features for each LN were evaluated: absence of hilum, cystic changes, calcification, strong enhancement, and heterogeneous enhancement. Multivariable logistic regression analysis was performed to identify independent CT features associated with metastatic LNs, and their diagnostic performances were evaluated. LNs were classified into probably benign, indeterminate, and suspicious categories according to the K-TIRADS and the modified LN classification proposed in our study. The diagnostic performance of both classification systems was compared using the exact McNemar and Kosinski tests. Results: The absence of hilum (odds ratio [OR], 4.859; 95% confidence interval [CI], 1.593-14.823; P = 0.005), strong enhancement (OR, 28.755; 95% CI, 12.719-65.007; P < 0.001), and cystic changes (OR, 46.157; 95% CI, 5.07-420.234; P = 0.001) were independently associated with metastatic LNs. All LNs showing calcification were diagnosed as metastases. Heterogeneous enhancement did not show a significant independent association with metastatic LNs. Strong enhancement, calcification, and cystic changes showed moderate to high specificity (70.1%-100%) and positive predictive value (PPV) (91.8%-100%). The absence of the hilum showed high sensitivity (97.8%) but low specificity (34.0%). The modified LN classification, which excluded heterogeneous enhancement from the K-TIRADS, demonstrated higher specificity (70.1% vs. 62.9%, P = 0.016) and PPV (92.5% vs. 90.9%, P = 0.011) than the K-TIRADS. Conclusion: Excluding heterogeneous enhancement as a suspicious feature resulted in a higher specificity and PPV for diagnosing metastatic LNs than the K-TIRADS. Our research results may provide a basis for revising the LN classification in future guidelines.
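
A minimal sketch of the standard diagnostic performance measures compared in the study above (sensitivity, specificity, PPV, NPV) computed from a 2x2 table; the counts used in the example are placeholders, not the study's data.

```python
def diagnostic_performance(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard 2x2-table metrics used when comparing LN classification systems."""
    return {
        "sensitivity": tp / (tp + fn),  # metastatic LNs correctly called suspicious
        "specificity": tn / (tn + fp),  # benign LNs correctly not called suspicious
        "ppv": tp / (tp + fp),          # suspicious calls that are truly metastatic
        "npv": tn / (tn + fn),
    }

# Placeholder counts for illustration only (not the study's data)
metrics = diagnostic_performance(tp=300, fp=25, fn=10, tn=128)
print({k: round(v, 3) for k, v in metrics.items()})
```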

Fully Automatic Coronary Calcium Score Software Empowered by Artificial Intelligence Technology: Validation Study Using Three CT Cohorts

  • June-Goo Lee; HeeSoo Kim; Heejun Kang; Hyun Jung Koo; Joon-Won Kang; Young-Hak Kim; Dong Hyun Yang
    • Korean Journal of Radiology, v.22 no.11, pp.1764-1776, 2021
  • Objective: This study aimed to validate a deep learning-based fully automatic calcium scoring (coronary artery calcium [CAC]_auto) system using previously published cardiac computed tomography (CT) cohort data, with the manually segmented coronary calcium scoring (CAC_hand) system as the reference standard. Materials and Methods: We developed the CAC_auto system using 100 co-registered, non-enhanced and contrast-enhanced CT scans. For validation of the CAC_auto system, three previously published CT cohorts (n = 2985) were chosen to represent different clinical scenarios (2647 asymptomatic, 220 symptomatic, 118 valve disease) and four CT models. The performance of the CAC_auto system in detecting coronary calcium was determined. The reliability of the system in measuring the Agatston score as compared with CAC_hand was also evaluated per vessel and per patient using intraclass correlation coefficients (ICCs) and Bland-Altman analysis. The agreement between CAC_auto and CAC_hand based on the cardiovascular risk stratification categories (Agatston score: 0, 1-10, 11-100, 101-400, > 400) was evaluated. Results: In 2985 patients, 6218 coronary calcium lesions were identified using CAC_hand. The per-lesion sensitivity and false-positive rate of the CAC_auto system in detecting coronary calcium were 93.3% (5800 of 6218) and 0.11 false-positive lesions per patient, respectively. In measuring the Agatston score, the CAC_auto system yielded ICCs of 0.99 for all vessels (left main 0.91, left anterior descending 0.99, left circumflex 0.96, right coronary 0.99). The limits of agreement between CAC_auto and CAC_hand were 1.6 ± 52.2. The linearly weighted kappa value for the Agatston score categorization was 0.94. The main causes of false-positive results were image noise (29.1%, 97/333 lesions), aortic wall calcification (25.5%, 85/333 lesions), and pericardial calcification (24.3%, 81/333 lesions). Conclusion: The atlas-based CAC_auto system empowered by deep learning provided accurate calcium score measurement and risk category classification as compared with the manual method, which could potentially streamline CAC imaging workflows.
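
A minimal sketch of the Agatston-score risk categorization used in the agreement analysis above (categories 0, 1-10, 11-100, 101-400, > 400 as listed in the abstract); the function name is illustrative.

```python
def agatston_risk_category(score: float) -> str:
    """Map an Agatston score to the risk strata used in the agreement analysis."""
    if score == 0:
        return "0"
    if score <= 10:
        return "1-10"
    if score <= 100:
        return "11-100"
    if score <= 400:
        return "101-400"
    return ">400"

print([agatston_risk_category(s) for s in (0, 4, 57, 250, 812)])
# ['0', '1-10', '11-100', '101-400', '>400']
```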

Development of a Malignancy Potential Binary Prediction Model Based on Deep Learning for the Mitotic Count of Local Primary Gastrointestinal Stromal Tumors

  • Jiejin Yang; Zeyang Chen; Weipeng Liu; Xiangpeng Wang; Shuai Ma; Feifei Jin; Xiaoying Wang
    • Korean Journal of Radiology, v.22 no.3, pp.344-353, 2021
  • Objective: The mitotic count of gastrointestinal stromal tumors (GIST) is closely associated with the risk of tumor seeding and metastasis. The purpose of this study was to develop a predictive model for the mitotic index of local primary GIST based on a deep learning algorithm. Materials and Methods: Abdominal contrast-enhanced CT images of 148 pathologically confirmed GIST cases were retrospectively collected for the development of a deep learning classification algorithm. The areas of GIST masses on the CT images were labelled by an experienced radiologist. The postoperative pathological mitotic count was considered the gold standard (high mitotic count, > 5/50 high-power fields [HPFs]; low mitotic count, ≤ 5/50 HPFs). A binary classification model was trained on the basis of the VGG16 convolutional neural network, using CT images split into a training set (n = 108), a validation set (n = 20), and a test set (n = 20). The sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) were calculated at both the image level and the patient level. Receiver operating characteristic curves were generated from the model prediction results and the areas under the curve (AUCs) were calculated. The risk categories of the tumors were predicted according to the Armed Forces Institute of Pathology criteria. Results: At the image level, the classification prediction results for the mitotic counts in the test cohort were as follows: sensitivity 85.7% (95% confidence interval [CI]: 0.834-0.877), specificity 67.5% (95% CI: 0.636-0.712), PPV 82.1% (95% CI: 0.797-0.843), NPV 73.0% (95% CI: 0.691-0.766), and AUC 0.771 (95% CI: 0.750-0.791). At the patient level, the classification prediction results in the test cohort were as follows: sensitivity 90.0% (95% CI: 0.541-0.995), specificity 70.0% (95% CI: 0.354-0.919), PPV 75.0% (95% CI: 0.428-0.933), NPV 87.5% (95% CI: 0.467-0.993), and AUC 0.800 (95% CI: 0.563-0.943). Conclusion: We developed and preliminarily verified a GIST mitotic count binary prediction model based on the VGG convolutional neural network. The model displayed good predictive performance.
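
A minimal sketch of a VGG16-based binary classifier of the kind described in the Methods above, using torchvision's pretrained VGG16 with a two-class head; the weight initialization, input pipeline, and hyperparameters are assumptions for illustration, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
from torchvision import models

# VGG16 backbone with its final fully connected layer replaced by a 2-class head
# (high vs. low mitotic count). Everything below is an illustrative setup.
model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
model.classifier[6] = nn.Linear(in_features=4096, out_features=2)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One optimization step on a batch of (N, 3, 224, 224) CT image crops."""
    model.train()
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```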

Salt Intake Behavior and Blood Pressure: the effect of taste sensitivity and preference (소금 섭취 행태와 혈압: 맛에 대한 민감도와 선호도의 영향)

  • Kim, Jin-Hee; Choi, Man-Kyu
    • Korean Journal of Human Ecology, v.16 no.4, pp.837-848, 2007
  • The literature suggests that a small reduction in overall blood pressure can have a large effect on the overall prevalence of hypertension, and therefore the effect of the population's taste preferences on salt intake should be considered in long-term blood pressure intervention programs. The purpose of this study was to investigate the influence of salt taste preference and salt taste sensitivity on salt intake behavior as risk factors for high blood pressure. We collected information on blood pressure, diet and lifestyle behaviors, salt taste preference, and salt taste sensitivity from 540 respondents in Suseo-dong, Seoul. Salt taste sensitivity was assessed by administering a 1% NaCl solution to the subject's tongue and measuring the perceived intensity on a 10-level scale. Salt intake behavior was classified into three categories: frequency of high-sodium foods, practice of salt-reducing behavior, and frequency of vegetable and fruit intake. Salt taste preference showed a significant relation to the subjects' blood pressure, i.e., subjects with a higher salt preference had higher blood pressure. Salt taste sensitivity did not show a significant relation to blood pressure; however, there was a positive correlation between salt taste preference and salt taste sensitivity. Among the three indicators used to measure salt intake behavior, the practice of salt-reducing behavior remained significantly correlated with blood pressure. Moreover, salt-reducing behavior and salt taste preference were significantly correlated, i.e., people who do not like salty foods tend to practice more salt-reducing behavior, leading to lower blood pressure. In a population, a small reduction in overall blood pressure can have a large effect on the overall prevalence of hypertension, in contrast to clinical studies where achievement of an individual's normal blood pressure is emphasized. Therefore, the taste preferences of the population should be considered in long-term blood pressure intervention programs.