• Title/Summary/Keyword: quant

Search Results: 32

Reproducibility of an Automatic Quantitation of Regional Myocardial Wall Motion and Systolic Thickening on Gated Tc-99m-MIBI Myocardial SPECT (게이트 Tc-99m-MIBI SPECT에서 국소 심근운동과 수축기 심근두꺼워짐 자동정량화법의 재현성)

  • Paeng, Jin-Chul;Lee, Dong-Soo;Cheon, Gi-Jeong;Kim, Yu-Kyeong;Chung, June-Key;Lee, Myung-Chul
    • The Korean Journal of Nuclear Medicine
    • /
    • v.34 no.6
    • /
    • pp.487-496
    • /
    • 2000
  • Purpose: The aim of this study was to investigate the reproducibility of the quantitative assessment of segmental wall motion and systolic thickening provided by an automatic quantitation algorithm. Materials and Methods: Tc-99m-MIBI gated myocardial SPECT with dipyridamole stress was performed twice consecutively, in the same position, in 31 patients with known or suspected coronary artery disease (4 with single, 6 with two, and 11 with triple vessel disease; ejection fraction 51 ± 14%). The myocardium was divided into 20 segments. Segmental wall motion and systolic thickening were calculated with AutoQUANT™ software and expressed in mm and % increase, respectively. The reproducibility of these quantitative measurements was then tested. Results: Correlations between repeated measurements on consecutive gated SPECT were excellent for wall motion (r=0.95) and systolic thickening (r=0.88). On Bland-Altman analysis, two standard deviations of the repeated-measurement difference were 2 mm for segmental wall motion and 20% for systolic thickening. The weighted kappa values of repeated measurements were 0.807 for wall motion and 0.708 for systolic thickening. Sex, perfusion, and segmental location had no influence on reproducibility. Conclusion: Segmental wall motion and systolic thickening quantified with AutoQUANT™ software on gated myocardial SPECT offer good reproducibility; a change is significant when it exceeds 2 mm for wall motion or 20% for systolic thickening.
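The Bland-Altman repeatability criterion used above can be sketched in a few lines of Python. The measurement values below are hypothetical, not taken from the study; only the 2-SD limits-of-agreement computation itself is standard.

```python
import math

def bland_altman_limits(first, second):
    """Return (bias, lower, upper): the mean difference between paired
    repeated measurements and its 2-SD limits of agreement."""
    diffs = [a - b for a, b in zip(first, second)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 2 * sd, bias + 2 * sd

# Hypothetical segmental wall-motion values (mm) from two consecutive scans.
scan1 = [5.1, 6.3, 4.8, 7.0, 5.5, 6.1]
scan2 = [5.3, 6.0, 5.0, 6.8, 5.4, 6.4]
bias, lower, upper = bland_altman_limits(scan1, scan2)
print(f"bias = {bias:.2f} mm; limits of agreement [{lower:.2f}, {upper:.2f}] mm")
```

A repeated measurement falling outside these limits would, by the study's criterion, indicate a real change rather than measurement noise.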


Assessment of Mild Cognitive Impairment in Elderly Subjects Using a Fully Automated Brain Segmentation Software

  • Kwon, Chiheon;Kang, Koung Mi;Byun, Min Soo;Yi, Dahyun;Song, Huijin;Lee, Ji Ye;Hwang, Inpyeong;Yoo, Roh-Eul;Yun, Tae Jin;Choi, Seung Hong;Kim, Ji-hoon;Sohn, Chul-Ho;Lee, Dong Young
    • Investigative Magnetic Resonance Imaging
    • /
    • v.25 no.3
    • /
    • pp.164-171
    • /
    • 2021
  • Purpose: Mild cognitive impairment (MCI) is a prodromal stage of Alzheimer's disease (AD). Brain atrophy in this disease spectrum begins in the medial temporal lobe structure, which can be recognized by magnetic resonance imaging. To overcome the unsatisfactory inter-observer reliability of visual evaluation, quantitative brain volumetry has been developed and widely investigated for the diagnosis of MCI and AD. The aim of this study was to assess the prediction accuracy of quantitative brain volumetry using a fully automated segmentation software package, NeuroQuant®, for the diagnosis of MCI. Materials and Methods: A total of 418 subjects from the Korean Brain Aging Study for Early Diagnosis and Prediction of Alzheimer's Disease cohort were included in our study. Each participant was allocated to either a cognitively normal old group (n = 285) or an MCI group (n = 133). Brain volumetric data were obtained from T1-weighted images using the NeuroQuant software package. Logistic regression and receiver operating characteristic (ROC) curve analyses were performed to investigate relevant brain regions and their prediction accuracies. Results: Multivariate logistic regression analysis revealed that normative percentiles of the hippocampus (P < 0.001), amygdala (P = 0.003), frontal lobe (P = 0.049), medial parietal lobe (P = 0.023), and third ventricle (P = 0.012) were independent predictive factors for MCI. In ROC analysis, normative percentiles of the hippocampus and amygdala showed fair accuracies in the diagnosis of MCI (area under the curve: 0.739 and 0.727, respectively). Conclusion: Normative percentiles of the hippocampus and amygdala provided by the fully automated segmentation software could be used for screening MCI with a reasonable post-processing time. This information might help us interpret structural MRI in patients with cognitive impairment.
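The diagnostic accuracies quoted above (AUC 0.739 and 0.727) are areas under the ROC curve, which can be computed directly from the Mann-Whitney pair-counting identity. The scores below are invented for illustration, not values from the study.

```python
def roc_auc(pos_scores, neg_scores):
    """AUC = P(score of a random positive > score of a random negative),
    counting ties as 1/2 (the Mann-Whitney U identity)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical classifier scores: higher score = more likely MCI.
mci = [0.9, 0.8, 0.7, 0.6]    # positive group
normal = [0.5, 0.4, 0.65]     # negative group
print(f"AUC = {roc_auc(mci, normal):.3f}")
```

An AUC of 0.5 corresponds to chance-level discrimination, and 1.0 to perfect separation; the 0.7-0.75 range reported above is conventionally read as "fair" accuracy.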

A Study on Global Blockchain Economy Ecosystem Classification and Intelligent Stock Portfolio Performance Analysis (글로벌 블록체인 경제 생태계 분류와 지능형 주식 포트폴리오 성과 분석)

  • Kim, Honggon;Ryu, Jongha;Shin, Woosik;Kim, Hee-Woong
    • Journal of Intelligence and Information Systems
    • /
    • v.28 no.3
    • /
    • pp.209-235
    • /
    • 2022
  • Since 2010, blockchain technology, together with the development of artificial intelligence, has been in the spotlight as a key technology leading the 4th industrial revolution, and research on its technological applications has continued. However, few studies have examined standards for classifying the blockchain economic ecosystem from a capital market perspective. This study combines two methodologies: interviews with software developers, entrepreneurs, market participants, and experts who use blockchain technology, conducted to view the blockchain economic ecosystem from a capital market perspective for equity investment; and case studies of the blockchain economic ecosystem according to the application fields of blockchain technology. In addition, as an approach that can be linked to equity investment in the capital market, a blockchain economic ecosystem classification methodology was established to form an investment universe of global blue-chip stocks. An intelligent portfolio was then constructed through quantitative and qualitative analyses based on quant and artificial intelligence strategies, and its performance was evaluated. Finally, a successful investment strategy according to the growth of the blockchain economic ecosystem is presented. This study not only classifies and analyzes blockchain standardization as a blockchain economic ecosystem from a capital market, rather than a technical, point of view, but also constructs a portfolio of global blue-chip stocks and develops strategies to achieve superior performance. It provides insights that combine global equity investment with the perspectives of investment theory and the economy, and therefore has practical implications for the development of capital markets.

Quantitative Analysis of X-Ray Fluorescence for Understanding the Effect of Elevated Temperatures on Cement Pastes (XRF (X-ray fluorescence)를 활용한 고온환경에 노출된 시멘트 페이스트 분석의 이해)

  • Kil-Song Jeon;Young-Sun Heo
    • Journal of the Korea institute for structural maintenance and inspection
    • /
    • v.27 no.6
    • /
    • pp.130-137
    • /
    • 2023
  • Using XRF (X-ray fluorescence), this study investigates the variation of chemical properties in cement pastes at elevated temperatures. High-temperature conditions were produced in an electric furnace at 11 target temperatures ranging from room temperature to 1000℃. The Geo-Quant Basic standard library was applied for the analysis of 12 elements in cement paste: Ca, Si, Al, Fe, S, Mg, Ti, Sr, P, Mn, Zn, and K. The results revealed that, as the temperature increased, the proportion of each element in the cement paste also increased. With the exception of a few elements present in extremely low amounts, the variation in the composition ratio of most elements exhibited a strong correlation with temperature, with an R-squared value exceeding 0.98. Cement pastes exposed to normal and high-temperature environments were then compared, and the authors show that the differences observed in this comparison can be explained from the same perspective as a comparison of raw cement with cement paste. The study also discusses the parameter that is potentially most dominant when investigating the properties of cement paste using XRF.
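The R-squared values above come from ordinary least-squares fits of composition ratio against temperature. A minimal sketch of that computation, on invented (here exactly linear, so R² = 1) data, is:

```python
def linear_fit_r2(x, y):
    """Least-squares slope/intercept of y on x, plus the R-squared value."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

# Hypothetical data: element proportion (%) vs target temperature (deg C).
temps = [20, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1000]
ratio = [0.005 * t + 60 for t in temps]   # exactly linear by construction
slope, intercept, r2 = linear_fit_r2(temps, ratio)
print(f"slope={slope:.4f}, intercept={intercept:.2f}, R^2={r2:.4f}")
```

With real measurements the scatter about the fitted line lowers R²; the study's observation is that it nonetheless stays above 0.98 for most elements.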

Agreement and Reliability between Clinically Available Software Programs in Measuring Volumes and Normative Percentiles of Segmented Brain Regions

  • Huijin Song;Seun Ah Lee;Sang Won Jo;Suk-Ki Chang;Yunji Lim;Yeong Seo Yoo;Jae Ho Kim;Seung Hong Choi;Chul-Ho Sohn
    • Korean Journal of Radiology
    • /
    • v.23 no.10
    • /
    • pp.959-975
    • /
    • 2022
  • Objective: To investigate the agreement and reliability of estimating the volumes and normative percentiles (N%) of segmented brain regions among NeuroQuant (NQ), DeepBrain (DB), and FreeSurfer (FS) software programs, focusing on the comparison between NQ and DB. Materials and Methods: Three-dimensional T1-weighted images of 145 participants (48 healthy participants, 50 patients with mild cognitive impairment, and 47 patients with Alzheimer's disease) from a single medical center (SMC) dataset and 130 participants from the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset were included in this retrospective study. All images were analyzed with DB, NQ, and FS software to obtain volume estimates and N% of various segmented brain regions. We used Bland-Altman analysis, repeated measures ANOVA, reproducibility coefficient, effect size, and intraclass correlation coefficient (ICC) to evaluate inter-method agreement and reliability. Results: Among the three software programs, the Bland-Altman plot showed a substantial bias, the ICC showed a broad range of reliability (0.004 to 0.97), and repeated-measures ANOVA revealed significant mean volume differences in all brain regions. Similarly, the volume differences of the three software programs had large effect sizes in most regions (0.73 to 5.51). The effect size was largest in the pallidum in both datasets and smallest in the thalamus and cerebral white matter in the SMC and ADNI datasets, respectively. N% of NQ and DB showed an unacceptably broad Bland-Altman limit of agreement in all brain regions and a very wide range of ICC values (−0.142 to 0.844) in most brain regions. Conclusion: NQ and DB showed significant differences in the measured volume and N%, with limited agreement and reliability for most brain regions. Therefore, users should be aware of the lack of interchangeability between these software programs when they are applied in clinical practice.
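The ICC values above measure absolute agreement between software programs. As an illustration only (using the simple one-way ICC(1) formula, not necessarily the ICC form the paper used, and with invented volumes), agreement can be computed from an ANOVA decomposition:

```python
def icc1(ratings):
    """One-way random-effects ICC(1) for a table of n subjects x k raters."""
    n = len(ratings)
    k = len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    # Between-subject and within-subject mean squares.
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - m) ** 2
              for row, m in zip(ratings, row_means)
              for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical hippocampal volumes (mL) measured by two software programs.
volumes = [[3.1, 3.3], [2.8, 2.9], [3.5, 3.4], [2.4, 2.7], [3.0, 3.2]]
print(f"ICC(1) = {icc1(volumes):.3f}")
```

An ICC near 1 means the programs rank and scale subjects almost identically; the near-zero (and even negative) values reported above are what "lack of interchangeability" looks like numerically.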

A Relationship between Pop Art and Fashion in the 60's (1960년대 팝 아트(Pop Art)의 사조와 패션)

  • Kim Minja
    • Journal of the Korean Society of Clothing and Textiles
    • /
    • v.10 no.1
    • /
    • pp.69-84
    • /
    • 1986
  • The objective of this paper was to identify the relationship between the fine arts, pop art, and fashion in relation to its qualities, motifs, and techniques of graffiti and collage. The data for this study were collected from fashion magazines, including French Vogue and American Vogue from 1962 through 1970 and Elle from 1980, postcards and reports from a costume exhibition at the Victoria & Albert Museum in London, and newspaper and magazine accounts. The qualities of pop art were characterized as 1) popular (designed for a mass audience), 2) transient (short-term solution), 3) expendable (easily forgotten), 4) low cost, 5) mass produced, 6) young (aimed at youth), 7) witty, 8) sexy and erotic, and 9) big business. Pop art was rooted in the urban environment, and according to the analysis of the data for this paper, these special aspects of that environment were reflected in the fashion of the 60's. Mary Quant, Zandra Rhodes, Y.S.L., Rudi Gernreich, Paco Rabanne, Pierre Cardin, and Andre Courreges in the 60's, and Castelbajac and Sprouse in the 80's, showed pop art dresses and mods fashion inspired by pop artists such as Hamilton, Donaldson, Allen Jones, Jasper Johns, Andy Warhol, and Keith Haring. A new eroticism in fashion was produced by Y.S.L.'s see-through blouse, Courreges' hipster pants, and Gernreich's bikinis, which revealed the navel and the breast. T-shirts and dresses ornamented with pop idols' faces, pop graffiti motifs, and slogans, as a form of resistance to society, began to become popular.


Analysis of G3BP1 and VEZT Expression in Gastric Cancer and Their Possible Correlation with Tumor Clinicopathological Factors

  • Beheshtizadeh, Mohammadreza;Moslemi, Elham
    • Journal of Gastric Cancer
    • /
    • v.17 no.1
    • /
    • pp.43-51
    • /
    • 2017
  • Purpose: This study aimed to analyze G3BP1 and VEZT expression profiles in patients with gastric cancer, and examine the possible relationship between the expressions of each gene and clinicopathological factors. Materials and Methods: Expression of these genes in formalin-fixed paraffin embedded (FFPE) tissues, collected from 40 patients with gastric cancer and 40 healthy controls, was analyzed. Differences in gene expression among patient and normal samples were identified using the GraphPad Prism 5 software. For the analysis of real-time polymerase chain reaction products, GelQuantNET software was used. Results: Our findings demonstrated that both VEZT and G3BP1 mRNA expression levels were downregulated in gastric cancer samples compared with those in the normal controls. No significant relationship was found between the expression of these genes and gender (P-value, 0.4835 vs. 0.6350), but there were significant changes associated with age (P-value, 0.0004 vs. 0.0001) and stage of disease (P-value, 0.0019 vs. 0.0001). In addition, there was a direct relationship between VEZT gene expression and metastasis (P-value, 0.0462), in contrast to G3BP1 that did not demonstrate any significant correlation (P-value, 0.1833). Conclusions: The results suggest that expression profiling of VEZT and G3BP1 can be used for diagnosis of gastric cancer, and specifically, VEZT gene could be considered as a biomarker for the detection of gastric cancer progression.

The Quantitative Analysis of SB Latex Contents in Coating Color and Coating Layer of Coated Paper Using FT/Raman Spectroscopy (FT/Raman을 이용한 도공액과 도공지의 도공층 내의 SB Latex 정량분석)

  • 이복진;정순기;윤동호;마금자
    • Journal of Korea Technical Association of The Pulp and Paper Industry
    • /
    • v.31 no.4
    • /
    • pp.16-22
    • /
    • 1999
  • The quantitative analysis of SB latex contents in coating color and coated paper was investigated with FT/Raman spectroscopy. From the measured FT/IR and FT/Raman spectra, the peaks of the coating color were compared with those of each component. Calibration curves were obtained from the area of the latex peaks and the PLS method of the QuantIR program, and the relation between values predicted by the PLS method and actual values in coating mixtures and the coating layer was examined. The components of the coating layer in coated paper were investigated by EDS, X-mapping, and SEM, and the latex content in the z-direction was calculated in the coating layer of unknown coated paper. Latex concentration measurements of the top layer and pre-layer in double-coated paper show that each layer has a different value. In single-coated paper, the latex concentration is clearly highest at the surface and decreases with increasing depth. These results indicate that the latex migrates to the coated surface. The results of this study may be applied to binder migration studies and quality control in paper mills.
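Although the paper uses the multivariate PLS routine of the QuantIR program, the underlying calibration idea can be illustrated with a univariate stand-in: fit latex content against Raman peak area on mixtures of known composition, then apply the fit to an unknown sample. All numbers below are invented.

```python
def fit_calibration(areas, contents):
    """Least-squares line: latex content (%) = slope * peak area + intercept."""
    n = len(areas)
    ma = sum(areas) / n
    mc = sum(contents) / n
    sxx = sum((a - ma) ** 2 for a in areas)
    sxy = sum((a - ma) * (c - mc) for a, c in zip(areas, contents))
    slope = sxy / sxx
    return slope, mc - slope * ma

def predict(area, slope, intercept):
    """Estimate latex content (%) from a measured peak area."""
    return slope * area + intercept

# Hypothetical calibration set: latex peak areas vs known latex contents (%).
areas = [120, 240, 360, 480, 600]
contents = [5, 10, 15, 20, 25]
slope, intercept = fit_calibration(areas, contents)
print(f"unknown sample at area 300 -> {predict(300, slope, intercept):.1f}% latex")
```

PLS extends this idea to many spectral variables at once, which matters when peaks of different components overlap.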


A Case Study on the Establishment of an Equity Investment Optimization Model based on FinTech: For Institutional Investors (핀테크 기반 주식투자 최적화 모델 구축 사례 연구 : 기관투자자 대상)

  • Kim, Hong Gon;Kim, Sodam;Kim, Hee-Woong
    • Knowledge Management Research
    • /
    • v.19 no.1
    • /
    • pp.97-118
    • /
    • 2018
  • The finance-investment industry is currently focusing on research related to artificial intelligence and big data, moving beyond conventional theories of financial engineering. However, equity-optimization portfolios built with artificial intelligence and big data, and their performance, are rarely realized in practice. The purpose of this study is therefore to propose process improvements in equity selection, information analysis, and portfolio composition, and ultimately an improvement in portfolio returns, through the case of an equity optimization model based on quantitative research with artificial intelligence. This paper is an empirical study of an artificial intelligence-based portfolio at "D" asset management, the largest domestic active quant fiduciary manager. The study applies artificial intelligence to finance, analyzing financial and supply-demand information and automating factor selection and equity weighting through machine learning based on an artificial neural network. It also documents the learning process for composing the optimized portfolio, and its performance, by applying genetic algorithms to the models. This study posits a model through which the asset management industry can achieve continuous and stable excess performance, with low costs and high efficiency, in the investment process.
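The genetic-algorithm step described above (evolving portfolio weights toward a better score) can be sketched as follows. The fitness function, asset statistics, and GA parameters are all simplified assumptions for illustration, not the paper's actual model.

```python
import random

random.seed(42)

RETURNS = [0.10, 0.06, 0.04, 0.08]   # hypothetical expected asset returns
VARS    = [0.09, 0.03, 0.01, 0.05]   # hypothetical return variances

def fitness(w):
    """Toy mean-variance score: expected return minus a risk penalty."""
    ret = sum(wi * r for wi, r in zip(w, RETURNS))
    risk = sum(wi ** 2 * v for wi, v in zip(w, VARS))
    return ret - 2.0 * risk

def random_weights(n=4):
    raw = [random.random() for _ in range(n)]
    s = sum(raw)
    return [x / s for x in raw]

def mutate(w, scale=0.05):
    # Perturb, clip at zero (long-only), and renormalize to sum 1.
    raw = [max(x + random.gauss(0, scale), 0.0) for x in w]
    s = sum(raw) or 1.0
    return [x / s for x in raw]

def crossover(a, b):
    # Averaging two unit-sum weight vectors keeps the sum at 1.
    return [(x + y) / 2 for x, y in zip(a, b)]

pop = [random_weights() for _ in range(30)]
history = []
for _ in range(60):
    pop.sort(key=fitness, reverse=True)
    history.append(fitness(pop[0]))
    elite = pop[:15]                      # elitism: keep the better half
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(15)]
best = max(pop, key=fitness)
print([round(x, 3) for x in best], round(fitness(best), 4))
```

Because the best half of each generation is carried over unchanged, the best fitness never decreases from one generation to the next; a real implementation would use a full covariance matrix and realized-performance feedback rather than this toy score.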

Validation of Reduced-volume Reaction in the PowerQuant® System for human DNA Quantification

  • Kim, Hyojeong;Cho, Yoonjung;Kim, Jeongyong;Lee, Ja Hyun;Kim, Hyo Sook;Kim, Eungsoo
    • Biomedical Science Letters
    • /
    • v.26 no.4
    • /
    • pp.275-287
    • /
    • 2020
  • Since its introduction in the forensic field, quantitative PCR (qPCR) has played an essential role in DNA analysis. The quality of DNA should be evaluated before short tandem repeat (STR) profiling to obtain reliable results and reduce unnecessary costs, and various human DNA quantification kits have been developed to this end. Among these kits, the PowerQuant® System was designed not only to determine the total amount of human DNA and human male DNA in a forensic evidence item, but also to provide data about the degradation of DNA samples. However, a crucial limitation of the PowerQuant® System is its high cost. Therefore, to minimize the cost of DNA quantification, we evaluated kit performance using a reduced volume of reagents (1/2 volume) with DNA samples of varying types and concentrations. Our results demonstrated that the low-volume method has almost comparable performance to the manufacturer's method for human DNA quantification, human male DNA quantification, and the DNA degradation index. Furthermore, using a reduced volume of reagents makes it possible to run twice as many reactions per kit. We expect the proposed low-volume method to cut costs in half for laboratories dealing with large numbers of DNA samples.
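qPCR quantification of the kind described above rests on a standard curve: the quantification cycle (Cq) is linear in log10(concentration), and amplification efficiency follows from the slope. A generic sketch of that calculation (not the PowerQuant® analysis software, and with idealized invented numbers):

```python
def fit_standard_curve(log10_conc, cq):
    """Least-squares fit of Cq = slope * log10(conc) + intercept."""
    n = len(cq)
    mx = sum(log10_conc) / n
    my = sum(cq) / n
    sxx = sum((x - mx) ** 2 for x in log10_conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_conc, cq))
    slope = sxy / sxx
    return slope, my - slope * mx

def quantity(cq, slope, intercept):
    """Invert the standard curve to estimate sample concentration."""
    return 10 ** ((cq - intercept) / slope)

def efficiency(slope):
    """Amplification efficiency; a slope near -3.32 corresponds to ~100%."""
    return 10 ** (-1 / slope) - 1

# Idealized dilution series: at 100% efficiency Cq drops ~3.32 per decade.
log10_conc = [0, 1, 2, 3, 4]
cq = [25.0 - 3.3219 * x for x in log10_conc]
slope, intercept = fit_standard_curve(log10_conc, cq)
print(f"slope={slope:.4f}, efficiency={efficiency(slope) * 100:.1f}%")
print(f"sample with Cq 25.0 -> {quantity(25.0, slope, intercept):.2f} units")
```

A half-volume reaction is viable precisely when it leaves this curve essentially unchanged, i.e. comparable slope, efficiency, and back-calculated quantities, which is what the study's comparison tests.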