• Title/Summary/Keyword: Data Accuracy

Coronary Computed Tomography Angiography for the Diagnosis of Vasospastic Angina: Comparison with Invasive Coronary Angiography and Ergonovine Provocation Test

  • Jiesuck Park;Hyung-Kwan Kim;Eun-Ah Park;Jun-Bean Park;Seung-Pyo Lee;Whal Lee;Yong-Jin Kim;Dae-Won Sohn
    • Korean Journal of Radiology / v.20 no.5 / pp.719-728 / 2019
  • Objective: To investigate the diagnostic validity of coronary computed tomography angiography (cCTA) in vasospastic angina (VA) and the factors associated with discrepant results between invasive coronary angiography with the ergonovine provocation test (iCAG-EPT) and cCTA. Materials and Methods: Of the 1397 patients diagnosed with VA from 2006 to 2016, 33 patients (75 lesions) with available cCTA data from within 6 months before iCAG-EPT were included. The severity of spasm (% diameter stenosis [%DS]) on iCAG-EPT and cCTA was assessed, and the difference in %DS (Δ%DS) was calculated. Δ%DS was compared after classifying the lesions according to pre-cCTA-administered sublingual nitroglycerin (SL-NG) or beta-blockers. The lesions were further categorized with %DS ≥ 50% on iCAG-EPT or cCTA defined as a significant spasm, and the diagnostic performance of cCTA in identifying significant spasm relative to iCAG-EPT was assessed. Results: Compared to lesions without SL-NG treatment, those with SL-NG treatment showed a higher Δ%DS (39.2% vs. 22.1%, p = 0.002). However, there was no difference in Δ%DS with or without beta-blocker treatment (35.1% vs. 32.6%, p = 0.643). The significant difference in Δ%DS associated with SL-NG was more prominent in patients who were aged < 60 years, were male, had a body mass index < 25 kg/m², and had no history of hypertension, diabetes, or dyslipidemia. With iCAG-EPT as the reference, the per-lesion sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of cCTA for VA diagnosis were 7.5%, 94.0%, 60.0%, 47.1%, and 48.0%, respectively. Conclusion: For patients with clinically suspected VA, confirmation with iCAG-EPT needs to be considered, rather than excluding the diagnosis of VA based solely on cCTA results, although further prospective studies are required for confirmation.
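
The per-lesion figures above come from a standard 2x2 confusion table against the reference test. A minimal sketch of the arithmetic (the counts below are hypothetical, chosen only to illustrate the formulas, not taken from the study):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic performance measures from a 2x2 confusion table,
    with the reference test (here iCAG-EPT) defining ground truth."""
    return {
        "sensitivity": tp / (tp + fn),                 # true-positive rate
        "specificity": tn / (tn + fp),                 # true-negative rate
        "ppv": tp / (tp + fp),                         # positive predictive value
        "npv": tn / (tn + fn),                         # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical counts for a 75-lesion sample, not the study's actual table
m = diagnostic_metrics(tp=3, fp=2, fn=37, tn=33)
```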

Nondestructive Quantification of Corrosion in Cu Interconnects Using Smith Charts (스미스 차트를 이용한 구리 인터커텍트의 비파괴적 부식도 평가)

  • Minkyu Kang;Namgyeong Kim;Hyunwoo Nam;Tae Yeob Kang
    • Journal of the Microelectronics and Packaging Society / v.31 no.2 / pp.28-35 / 2024
  • Corrosion inside electronic packages significantly impacts system performance and reliability, necessitating non-destructive diagnostic techniques for system health management. This study presents a non-destructive method for assessing corrosion in copper interconnects using the Smith chart, a tool that integrates the magnitude and phase of complex impedance into a single visualization. For the experiment, specimens simulating copper transmission lines were subjected to temperature and humidity cycles according to the MIL-STD-810G standard to induce corrosion. The corrosion level of each specimen was quantitatively assessed and labeled based on color changes in the R channel. As corrosion progressed, the S-parameters and Smith charts showed distinct patterns corresponding to five levels of corrosion, confirming the effectiveness of the Smith chart as a tool for corrosion assessment. Furthermore, by employing data augmentation, 4,444 Smith charts representing various corrosion levels were obtained, and artificial intelligence models were trained to output the corrosion stage of the copper interconnects from an input Smith chart. Among CNN and Transformer models specialized for image classification, the ConvNeXt model achieved the highest diagnostic performance with an accuracy of 89.4%. Diagnosing corrosion with the Smith chart allows a non-destructive evaluation using electronic signals, and because signal magnitude and phase information are integrated and visualized together, the method is expected to offer an intuitive and noise-robust diagnosis.
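
Each point on a Smith chart is the complex reflection coefficient of the line, which folds impedance magnitude and phase into one plane. A minimal sketch of that mapping, assuming a standard 50 Ω reference impedance (the load value below is hypothetical, not a measured specimen):

```python
import cmath

def reflection_coefficient(z_load, z0=50.0):
    """Map a complex load impedance to the reflection coefficient
    Gamma = (Z - Z0) / (Z + Z0). Smith charts plot Gamma inside the unit
    disk, so magnitude and phase are carried jointly by one point."""
    return (z_load - z0) / (z_load + z0)

# Hypothetical impedance of a degraded transmission line
gamma = reflection_coefficient(complex(75, 25))
magnitude, phase = abs(gamma), cmath.phase(gamma)
```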

Product Evaluation Criteria Extraction through Online Review Analysis: Using LDA and k-Nearest Neighbor Approach (온라인 리뷰 분석을 통한 상품 평가 기준 추출: LDA 및 k-최근접 이웃 접근법을 활용하여)

  • Lee, Ji Hyeon;Jung, Sang Hyung;Kim, Jun Ho;Min, Eun Joo;Yeo, Un Yeong;Kim, Jong Woo
    • Journal of Intelligence and Information Systems / v.26 no.1 / pp.97-117 / 2020
  • Product evaluation criteria are indicators describing the attributes or values of products, which enable users or manufacturers to measure and understand the products. When companies analyze their products or compare them with competitors, appropriate criteria must be selected for objective evaluation. The criteria should reflect the product features consumers considered when they purchased, used, and evaluated the products. However, current evaluation criteria do not reflect the opinions of different consumers from product to product. Previous studies tried to use online reviews from e-commerce sites, which reflect consumer opinions, to extract product features and topics and use them as evaluation criteria. However, these approaches still produce criteria irrelevant to the products, because extracted but inappropriate words are not refined out. To overcome this limitation, this research suggests an LDA-k-NN model, which extracts candidate criteria words from online reviews using LDA and refines them with the k-nearest neighbor approach. The proposed approach starts with a preparation phase consisting of six steps. First, review data are collected from e-commerce websites. Most e-commerce websites classify their items into high-level, middle-level, and low-level categories. Review data for the preparation phase are gathered from each middle-level category and later merged to represent a single high-level category. Next, nouns, adjectives, adverbs, and verbs are extracted from the reviews using part-of-speech information from a morpheme analysis module. After preprocessing, the words per topic are obtained with LDA, and only the nouns among the topic words are kept as candidate criteria words. The words are then tagged based on their suitability as criteria for each middle-level category. Next, every tagged word is vectorized by a pre-trained word embedding model. Finally, a k-nearest neighbor case-based approach is used to classify each word against the tags.
After the preparation phase, the criteria extraction phase is conducted on low-level categories. This phase starts by crawling reviews in the corresponding low-level category. The same preprocessing as in the preparation phase is conducted using the morpheme analysis module and LDA. Candidate criteria words are extracted by taking the nouns from the data and vectorizing them with the pre-trained word embedding model. Finally, evaluation criteria are extracted by refining the candidate words using the k-nearest neighbor approach and the reference proportion of each word in the word set. To evaluate the performance of the proposed model, an experiment was conducted with reviews from '11st', one of the biggest e-commerce companies in Korea. Review data came from the 'Electronics/Digital' section, one of the high-level categories in 11st. For performance evaluation, three other models were compared with the suggested model: the actual criteria of 11st, a model that extracts nouns with the morpheme analysis module and refines them by word frequency, and a model that extracts nouns from LDA topics and refines them by word frequency. The evaluation was set up to predict the evaluation criteria of 10 low-level categories with the suggested model and the three models above. Criteria words extracted from each model were combined into a single word set, which was used for survey questionnaires. In the survey, respondents chose every item they considered an appropriate criterion for each category. A model scored whenever a chosen word had been extracted by that model. The suggested model had higher scores than the other models in 8 out of 10 low-level categories. By conducting paired t-tests on the scores of each model, we confirmed that the suggested model shows better performance in 26 of 30 tests. In addition, the suggested model was the best model in terms of accuracy.
This research proposes an evaluation criteria extraction method that combines topic extraction using LDA with refinement by the k-nearest neighbor approach. The method overcomes the limits of previous dictionary-based models and frequency-based refinement models. This study can contribute to improving review analysis for deriving business insights in the e-commerce market.
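
The k-nearest neighbor refinement step described above can be sketched in a few lines: each candidate word is classified by majority vote among its nearest labeled word vectors. The 2-D vectors below are toy stand-ins for the pre-trained word embeddings, and the labels are hypothetical:

```python
import math
from collections import Counter

def knn_refine(query_vec, labeled_vecs, k=3):
    """Classify a candidate criteria word by majority vote among the k
    labeled word vectors nearest to it under cosine distance."""
    def cosine_dist(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return 1.0 - dot / (na * nb)

    nearest = sorted(labeled_vecs, key=lambda lv: cosine_dist(query_vec, lv[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Toy 2-D "embeddings" tagged during the preparation phase
labeled = [((1.0, 0.0), "criteria"), ((0.9, 0.1), "criteria"),
           ((0.1, 0.9), "not_criteria"), ((0.0, 1.0), "not_criteria")]
tag = knn_refine((0.95, 0.05), labeled)   # -> "criteria"
```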

The Influence Evaluation of $^{201}Tl$ Myocardial Perfusion SPECT Image According to the Elapsed Time Difference after the Whole Body Bone Scan (전신 뼈 스캔 후 경과 시간 차이에 따른 $^{201}Tl$ 심근관류 SPECT 영상의 영향 평가)

  • Kim, Dong-Seok;Yoo, Hee-Jae;Ryu, Jae-Kwang;Yoo, Jae-Sook
    • The Korean Journal of Nuclear Medicine Technology / v.14 no.1 / pp.67-72 / 2010
  • Purpose: At Asan Medical Center, we perform myocardial perfusion SPECT to evaluate the cardiac event risk of patients undergoing non-cardiac surgery. For patients with cancer, we check for tumor metastasis using a whole-body bone scan and a whole-body PET scan and then perform myocardial perfusion SPECT to avoid unnecessary exams. To shorten hospitalization, we perform $^{201}Tl$ myocardial perfusion SPECT a minimum of 16 hours after the whole-body bone scan, but the effect of crosstalk contamination from administering two different isotopes has not been properly evaluated. In our experiments, we therefore evaluated the influence of crosstalk contamination on $^{201}Tl$ myocardial perfusion SPECT using an anthropomorphic torso phantom and patient data. Materials and Methods: From August to September 2009, we analyzed 87 patients who underwent $^{201}Tl$ myocardial perfusion SPECT. Patients were classified according to whether a whole-body bone scan had been performed the previous day. Image data were obtained using a dual energy window in $^{201}Tl$ myocardial perfusion SPECT, and we analyzed the $^{201}Tl$-to-$^{99m}Tc$ count ratio in the image data of each patient group. For the phantom experiment, we used an anthropomorphic torso phantom and administered $^{201}Tl$ 14.8 MBq (0.4 mCi) to the myocardium and $^{99m}Tc$ 44.4 MBq (1.2 mCi) to the extracardiac region. Images were acquired by $^{201}Tl$ myocardial perfusion SPECT without gating, and spatial resolution was analyzed using Xeleris ver. 2.0551.
Results: In patients who underwent a whole-body bone scan the previous day, the ratio of counts in the $^{201}Tl$ window to counts in the $^{99m}Tc$ window decreased exponentially with elapsed time after bone-tracer injection, from 1:0.411 at 12 hours to 1:0.114 at 24 hours with Ventri (GE Healthcare, Wisconsin, USA) and from 1:0.79 to 1:0.249 with Infinia (GE Healthcare, Wisconsin, USA) (Ventri p=0.001, Infinia p=0.001). In patients without a prior whole-body bone scan, the mean ratio was 1:$0.067{\pm}0.6$ with Ventri and 1:$0.063{\pm}0.7$ with Infinia. In the phantom experiment, spatial resolution measurements showed no notable change in FWHM with $^{99m}Tc$ administration or with elapsed time (p=0.134). Conclusion: Through the experiments using the anthropomorphic torso phantom and patient data, we confirmed that $^{201}Tl$ myocardial perfusion SPECT performed 16 hours or more after bone-tracer injection is not notably influenced by $^{99m}Tc$ in terms of spatial resolution. However, this investigation addressed image quality only, so further investigation of patient radiation dose and exam accuracy and precision is needed. An exact guideline on the exam interval should be established through precise, standardized validation tests of the crosstalk contamination arising from the use of different isotopes.

The Classification System and Information Service for Establishing a National Collaborative R&D Strategy in Infectious Diseases: Focusing on the Classification Model for Overseas Coronavirus R&D Projects (국가 감염병 공동R&D전략 수립을 위한 분류체계 및 정보서비스에 대한 연구: 해외 코로나바이러스 R&D과제의 분류모델을 중심으로)

  • Lee, Doyeon;Lee, Jae-Seong;Jun, Seung-pyo;Kim, Keun-Hwan
    • Journal of Intelligence and Information Systems / v.26 no.3 / pp.127-147 / 2020
  • The world is suffering numerous human and economic losses due to the novel coronavirus infection (COVID-19). The Korean government established a strategy to overcome the national infectious disease crisis through research and development. It is difficult to find distinctive features and changes in a specific R&D field when using the existing technical classification or the science and technology standard classification. Recently, a few studies have been conducted to establish a classification system that provides information about the investment research areas of infectious diseases in Korea through a comparative analysis of Korean government-funded research projects. However, these studies did not provide the information necessary for establishing cooperative research strategies among countries in infectious diseases, which is required as an execution plan to achieve the goals of national health security and fostering new growth industries. Therefore, it is necessary to study information services based on a classification system and a classification model for establishing a national collaborative R&D strategy. A seven-category classification system (Diagnosis_biomarker, Drug_discovery, Epidemiology, Evaluation_validation, Mechanism_signaling pathway, Prediction, and Vaccine_therapeutic antibody) was derived by reviewing infectious disease-related national-funded research projects of South Korea. A classification model was trained by combining Scopus data with a bidirectional RNN model. The classification performance of the final model secured robustness with an accuracy of over 90%.
For the empirical study, the infectious disease classification system was applied to the coronavirus-related R&D projects of major countries, drawn from STAR Metrics (National Institutes of Health) and the NSF (National Science Foundation) of the United States (US), CORDIS (Community Research & Development Information Service) of the European Union (EU), and KAKEN (Database of Grants-in-Aid for Scientific Research) of Japan. The coronavirus R&D activity of these major countries is mostly concentrated in prediction, which deals with predicting success in clinical trials at the new drug development stage or predicting toxicity that causes side effects. The intriguing result is that, for all of these nations, the portion of national investment in vaccine_therapeutic antibody, the area of R&D aimed at developing vaccines and treatments, was very small (5.1%), which indirectly explains the slow development of vaccines and treatments. Comparative analysis of the investment status of coronavirus-related research projects by country showed that the US and Japan invest relatively evenly across all infectious disease-related research areas, while Europe has relatively large investments in specific research areas such as diagnosis_biomarker. Moreover, the classification system provided information on major coronavirus-related research organizations in these countries, thereby supporting the establishment of international collaborative R&D projects.

Measurements of Dissociation Enthalpy for Simple Gas Hydrates Using High Pressure Differential Scanning Calorimetry (고압 시차 주사 열량계를 이용한 단일 객체 가스 하이드레이트의 해리 엔탈피 측정)

  • Lee, Seungmin;Park, Sungwon;Lee, Youngjun;Kim, Yunju;Lee, Ju Dong;Lee, Jaehyoung;Seo, Yongwon
    • Korean Chemical Engineering Research / v.50 no.4 / pp.666-671 / 2012
  • Gas hydrates are inclusion compounds formed when small guest molecules are incorporated into the well-defined cages made up of hydrogen-bonded water molecules. Since large masses of natural gas hydrates exist in permafrost regions or beneath deep oceans, these naturally occurring gas hydrates, containing mostly $CH_4$, are regarded as future energy resources. The heat of dissociation is one of the most important thermal properties in exploiting natural gas hydrates, and the most accurate and direct way to measure the dissociation enthalpies of gas hydrates is to use a calorimeter. In this study, a high-pressure micro DSC (Differential Scanning Calorimeter) was used to measure the dissociation enthalpies of methane, ethane, and propane hydrates. The accuracy and repeatability of the data obtained from the DSC were confirmed by measuring the dissociation enthalpy of ice. The dissociation enthalpies of methane, ethane, and propane hydrates were found to be 54.2, 73.8, and 127.7 kJ/mol-gas, respectively. For each gas hydrate, the dissociation temperatures obtained at given pressures in the course of the enthalpy measurements were compared with three-phase (hydrate (H) - liquid water (Lw) - vapor (V)) equilibrium data in the literature and found to be in good agreement with literature values.
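
Direct calorimetric values like those above are commonly cross-checked against the indirect Clausius-Clapeyron estimate, which uses the slope of the three-phase equilibrium curve mentioned in the abstract. A sketch assuming ideal-gas behavior and a locally constant enthalpy; the two equilibrium points below are hypothetical, not the paper's data:

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def clausius_clapeyron_dh(t1_k, p1, t2_k, p2):
    """Indirect dissociation-enthalpy estimate (J per mol of gas) from two
    points on the H-Lw-V equilibrium curve:
    dH = -R * d(ln P) / d(1/T), assuming ideal gas and constant dH."""
    return -R * (math.log(p2) - math.log(p1)) / (1.0 / t2_k - 1.0 / t1_k)

# Hypothetical equilibrium points (T in K, P in MPa; only the pressure
# ratio matters, so the pressure unit cancels out)
dh = clausius_clapeyron_dh(278.0, 4.0, 283.0, 7.0)   # ~73 kJ/mol here
```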

Mapping of the Righteous Tree Selection for a Given Site Using Digital Terrain Analysis on a Central Temperate Forest (수치지형해석(數値地形解析)에 의한 온대중부림(溫帶中部林)의 적지적수도(適地適樹圖) 작성(作成))

  • Kang, Young-Ho;Jeong, Jin-Hyun;Kim, Young-Kul;Park, Jae-Wook
    • Journal of Korean Society of Forest Science / v.86 no.2 / pp.241-250 / 1997
  • The study was conducted to make a map for selecting the right tree species for each site by digital terrain analysis. We set an algorithmic value for each tree species' characteristics with distribution pattern analysis, and the soil types were digitized from data indicated on the soil map. Mean altitude, slope, aspect, and micro-topography were estimated from the digital map for each block, calculated by regression equations with altitude. The results obtained from the study can be summarized as follows: 1. We developed a method to select the right tree species for a given site, considering soil, forest condition, and topographic factors, for Muju-Gun in Chonbuk province (2,500 ha) by terrain analysis and a multi-variate digital map on a personal computer. 2. Brown forest soils were the major soil type of the study area, and 29 tree species occurred, with Pinus densiflora as the dominant species. Differences in site condition and soil properties resulted in site quality differences for each tree species. 3. We assessed the accuracy of a BASIC program (DTM.BAS) developed for this study by comparing the mean altitude and aspect calculated from the topographic terrain analysis map with those from surveyed data. The differences in altitude were less than 5%, which can be accepted as statistically allowable, and the aspect values likewise showed no differences; this indicates that the program can be used efficiently in further work. 4. On the righteous-site selection map, the 2nd group (R, $B_1$) took the largest area with 46%, followed by the non-forest area (L) with 23%, the 5th group with 7%, and the 4th group with 5%; the other groups each occupied less than 6%. 5. We suggested four types of management tools by silvicultural tree species, considering soil type and topographic conditions.

Three-dimensional Model Generation for Active Shape Model Algorithm (능동모양모델 알고리듬을 위한 삼차원 모델생성 기법)

  • Lim, Seong-Jae;Jeong, Yong-Yeon;Ho, Yo-Sung
    • Journal of the Institute of Electronics Engineers of Korea SP / v.43 no.6 s.312 / pp.28-35 / 2006
  • Statistical models of shape variability based on active shape models (ASMs) have been successfully utilized to perform segmentation and recognition tasks in two-dimensional (2D) images. Three-dimensional (3D) model-based approaches are more promising than 2D approaches since they can bring in more realistic shape constraints for recognizing and delineating the object boundary. For 3D model-based approaches, however, building the 3D shape model from a training set of segmented instances of an object is a major challenge and currently remains an open problem. In building the 3D shape model, one essential step is to generate a point distribution model (PDM). Corresponding landmarks must be selected in all training shapes to generate the PDM, and manual determination of landmark correspondences is very time-consuming, tedious, and error-prone. In this paper, we propose a novel automatic method for generating 3D statistical shape models. Given a set of training 3D shapes, we generate a 3D model by 1) building the mean shape from the distance transform of the training shapes, 2) utilizing a tetrahedron method for automatically selecting landmarks on the mean shape, and 3) subsequently propagating these landmarks to each training shape via a distance labeling method. We investigate the accuracy and compactness of the 3D model for the human liver built from 50 segmented individual CT data sets. The proposed method is very general, requires no shape-specific assumptions, and can be applied to other data sets.
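
Once landmark correspondences exist, the point distribution model itself is just statistics over the stacked landmark vectors: a mean shape plus principal modes of variation. A minimal sketch that finds the first mode by power iteration; the toy shapes below are hypothetical, not the liver data:

```python
import math

def pdm_first_mode(shapes, iters=100):
    """Build the core of a point distribution model: the mean shape and the
    first mode of variation (leading eigenvector of the landmark covariance),
    found by power iteration without forming the full covariance matrix."""
    n, d = len(shapes), len(shapes[0])
    mean = [sum(s[j] for s in shapes) / n for j in range(d)]
    devs = [[s[j] - mean[j] for j in range(d)] for s in shapes]

    def cov_times(v):
        # (1/(n-1)) * sum_i devs_i * (devs_i . v)
        out = [0.0] * d
        for dv in devs:
            coef = sum(dv[j] * v[j] for j in range(d)) / (n - 1)
            for j in range(d):
                out[j] += coef * dv[j]
        return out

    v = [1.0 / math.sqrt(d)] * d
    for _ in range(iters):
        w = cov_times(v)
        norm = math.sqrt(sum(x * x for x in w)) or 1.0
        v = [x / norm for x in w]
    return mean, v

# Toy training set: three corresponded one-landmark 2-D shapes varying along x
mean, mode = pdm_first_mode([[0.0, 0.0], [2.0, 0.0], [4.0, 0.0]])
```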

The Evaluation of the Usability of the Varian Standard Couch Modeling Using a Treatment Planning System (치료계획 시스템을 이용한 Varian Standard Couch 모델링의 유용성 평가)

  • Yang, Yong Mo;Song, Yong Min;Kim, Jin Man;Choi, Ji Min;Choi, Byeung Gi
    • The Journal of Korean Society for Radiation Therapy / v.28 no.1 / pp.77-86 / 2016
  • Purpose : During radiation treatment, the beam is attenuated by the carbon fiber couch. In this study, we evaluated the usability of the Varian Standard Couch (VSC) by modeling it with a treatment planning system (TPS). Materials and Methods : The VSC was scanned by the CBCT (Cone Beam Computed Tomography) of a linac (Clinac IX, VARIAN, USA) in its three configurations: Side Rail Out Grid (SROG), Side Rail In Grid (SRIG), and Side Rail In Out Spine Down Bar (SRIOS). After the scan, the data were transferred to the TPS, and the Side Rail, Side Bar Upper, Side Bar Lower, and Spine Down Bar were modeled by automatic contouring. We scanned a Cheese Phantom (Middelton, USA) with computed tomography (Light Speed RT 16, GE, USA), transferred the data to the TPS, and applied the previously modeled VSC to it. Dose was measured with an ion chamber (A1SL, Standard Imaging, USA) at the isocenter in the Cheese Phantom using 4 and 10 MV beams at every $5^{\circ}$ gantry angle for two field sizes ($3{\times}3cm^2$, $10{\times}10cm^2$) with fixed MU (=100), and we then compared the calculated and measured doses. We also included the dose at $127^{\circ}$ in SRIG to compare the attenuation by the Side Bar Upper. Results : The density of the VSC determined by CBCT in the TPS was $0.9g/cm^3$, and that of the Spine Down Bar was $0.7g/cm^3$. The radiation was attenuated by 17.49%, 16.49%, 8.54%, and 7.59% at the Side Rail, Side Bar Upper, Side Bar Lower, and Spine Down Bar, respectively. To check the accuracy of the modeling, calculated and measured doses were compared: the average error was 1.13%, and the maximum error was 1.98% at the $170^{\circ}$ beam crossing the Spine Down Bar. Conclusion : In the comparison between calculated and measured doses for the VSC modeled in the TPS, the maximum error was 1.98%. VSC modeling helps predict the delivered dose, so we think it will be helpful for more accurate treatment.
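
The attenuation percentages and modeling errors above reduce to two simple ratios, sketched here with hypothetical dose readings (not the paper's raw measurements):

```python
def attenuation_pct(open_dose, couch_dose):
    """Percent attenuation from a couch structure: dose lost relative to an
    unobstructed beam at the same gantry angle and MU."""
    return (open_dose - couch_dose) / open_dose * 100.0

def model_error_pct(calculated, measured):
    """Percent error of the TPS-calculated dose against the chamber reading."""
    return abs(calculated - measured) / measured * 100.0

# Hypothetical readings (cGy): open beam vs. beam crossing a side rail
att = attenuation_pct(100.0, 82.51)
err = model_error_pct(81.0, 82.51)
```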

Retrieval of Sulfur Dioxide Column Density from TROPOMI Using the Principle Component Analysis Method (주성분분석방법을 이용한 TROPOMI로부터 이산화황 칼럼농도 산출 연구)

  • Yang, Jiwon;Choi, Wonei;Park, Junsung;Kim, Daewon;Kang, Hyeongwoo;Lee, Hanlim
    • Korean Journal of Remote Sensing / v.35 no.6_3 / pp.1173-1185 / 2019
  • We, for the first time, retrieved sulfur dioxide (SO2) vertical column density (VCD) in industrial and volcanic areas from the TROPOspheric Monitoring Instrument (TROPOMI) using the Principal Component Analysis (PCA) algorithm. Furthermore, SO2 VCDs retrieved by the PCA algorithm from TROPOMI raw data were compared with those retrieved by the Differential Optical Absorption Spectroscopy (DOAS) algorithm (the TROPOMI Level 2 SO2 product). In East Asia, where large amounts of SO2 are released near the surface by anthropogenic sources such as fossil fuel combustion, the mean SO2 VCD retrieved by the PCA (DOAS) algorithm was 0.05 DU (-0.02 DU). The correlation between SO2 VCDs retrieved by the PCA algorithm and those retrieved by the DOAS algorithm was low (slope = 0.64; correlation coefficient (R) = 0.51) under cloudy conditions. However, with a cloud fraction of less than 0.5, the slope and correlation coefficient between the two outputs increased to 0.68 and 0.61, respectively. This means that in both algorithms the retrieval sensitivity to SO2 near the surface is reduced when the cloud fraction is high. Furthermore, the correlation between volcanic SO2 VCDs retrieved by the two algorithms was high (R = 0.90) even under cloudy conditions. This good agreement for volcanic SO2 is thought to be due to the higher accuracy of satellite-based SO2 VCD retrieval for SO2 that is mainly distributed in the upper troposphere or lower stratosphere in volcanic regions.