• Title/Summary/Keyword: Automatic Analysis

Optimized Methods of Preimplantation Genetic Diagnosis for Trinucleotide Repeat Diseases of Huntington's Disease, Spinocerebellar Ataxia 3 and Fragile X Syndrome (삼핵산 반복서열 질환인 헌팅톤병, 척수소뇌성 운동실조증, X-염색체 취약 증후군의 착상전 유전진단 방법에 대한 연구)

  • Kim, Min-Jee;Lee, Hyoung-Song;Lim, Chun-Kyu;Cho, Jae-Won;Kim, Jin-Young;Koong, Mi-Kyoung;Son, In-Ok;Kang, Inn-Soo;Jun, Jin-Hyon
    • Clinical and Experimental Reproductive Medicine, v.34 no.3, pp.179-188, 2007
  • Objectives: Many neurological diseases are known to be caused by expansion of trinucleotide repeats (TNRs). Detecting alterations of TNRs at the single-cell level for preimplantation genetic diagnosis (PGD) is difficult. In this study, we describe methods optimized for PGD of TNR-related diseases such as Huntington's disease (HD), spinocerebellar ataxia 3 (SCA3) and fragile X syndrome (FXS). Methods: We performed preclinical assays on lymphocytes from heterozygous patients using a single-cell PCR strategy. Fluorescent semi-nested PCR and fragment analysis on an automatic genetic analyzer were applied for HD and SCA3. Whole-genome amplification by the multiple displacement amplification (MDA) method followed by fluorescent PCR was carried out for FXS. Amplification and allele drop-out (ADO) rates were evaluated in each case. Results: Fluorescent semi-nested PCR of single lymphocytes showed an amplification rate of 100.0% with an ADO rate of 14.0% in HD, and an amplification rate of 94.7% with an ADO rate of 5.6% in SCA3. We could not detect a PCR product of the CGG repeats in FXS using fluorescent semi-nested PCR alone. After applying the MDA method in FXS, an amplification rate of 84.2% and an ADO rate of 31.3% were achieved. Conclusions: Fluorescent semi-nested PCR is a reliable method for PGD of HD and SCA3. The MDA-based method overcomes the amplification failure of the CGG repeats in FXS. Optimizing methods for single-cell analysis can improve the sensitivity and reliability of PGD for complicated single-gene disorders involving TNRs.
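
The amplification and ADO rates reported above are simple proportions over the tested single cells. A minimal sketch of how such per-locus statistics might be tallied is given below; the outcome encoding and counts are illustrative, not the study's data.

```python
# Hypothetical tally of single-cell PCR outcomes for a heterozygous locus.
# Each cell result is recorded as a tuple of detected alleles, e.g. ("A1", "A2"),
# ("A1",) when one allele dropped out, or () when amplification failed entirely.

def pcr_statistics(results):
    """Return (amplification_rate, ado_rate) as percentages."""
    amplified = [r for r in results if len(r) > 0]        # at least one allele detected
    ado = [r for r in amplified if len(r) == 1]           # only one of the two alleles detected
    amp_rate = 100.0 * len(amplified) / len(results)
    ado_rate = 100.0 * len(ado) / len(amplified) if amplified else 0.0
    return amp_rate, ado_rate

# Illustrative example: 20 cells, 19 amplified, 1 with allele drop-out.
example = [("A1", "A2")] * 18 + [("A1",)] + [()]
print(pcr_statistics(example))  # -> (95.0, ~5.3)
```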

The Correlation Analysis of Ejection Fraction: Comparison of $^{201}Tl$ gated Myocardial Perfusion SPECT and Echocardiography ($^{201}Tl$ 게이트 심근관류 SPECT 및 심초음파의 좌심실 구혈률 상관관계 비교)

  • Yoon, Soon Sang;Ryu, Jae Kwang;Cha, Min Kyung;Lee, Jong Hun;Kim, Sung Hwan
    • The Korean Journal of Nuclear Medicine Technology, v.16 no.2, pp.49-56, 2012
  • Purpose: Gated myocardial perfusion SPECT provides not only myocardial perfusion status but also various functional parameters of the left ventricle (LV). The purpose of this study was to analyze the correlation and difference in ejection fraction (EF) between $^{201}Tl$ gated myocardial perfusion SPECT and echocardiography depending on the extent of perfusion defect, gender and LV volume. Materials and Methods: From April 2011 to May 2012, we analyzed 291 patients (male:female = 165:126; mean age $64.6{\pm}10.8$ years) who underwent both $^{201}Tl$ gated myocardial perfusion SPECT and echocardiography less than 7 days apart in our hospital. 101 patients showed a perfusion defect and the remaining patients showed none. We applied automatic analysis (Quantitative Gated SPECT, QGS) and calculated EF, end-diastolic volume (EDV) and end-systolic volume (ESV) from the stress (G-Stress) and rest (G-Rest) studies, then analyzed the correlation and difference in EF between $^{201}Tl$ gated SPECT and echocardiography. Results: The correlation of LVEF among G-Stress, G-Rest and echocardiography was good (G-Stress vs. G-Rest: r=0.909, G-Stress vs. echocardiography: r=0.833, G-Rest vs. echocardiography: r=0.825). There were significant differences in EDV, ESV and EF across all patients (p<0.01). The normal group showed a significant difference in EF (p<0.01), and the groups with perfusion defects also showed significant differences depending on the extent of the defect (reversible defect: p<0.01, fixed defect: p<0.01). Within the normal group, differences were analyzed by gender: EF showed no significant difference in men (p>0.05) but a significant difference in women (p<0.01). When the women were divided into two groups by the average EDV of Korean women, there was no significant difference in the group with above-average EDV (p>0.05). Conclusion: Comparing the stress and rest studies of $^{201}Tl$ gated SPECT with echocardiography, we confirmed a good correlation for LVEF, but there were significant differences among the three studies. The extent of perfusion defect, gender and LV volume are independent determinants of the accuracy of LVEF, so it is difficult to compare or interchange quantitative indices across modalities. Additional research is needed to confirm these results.
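
The correlation-and-difference analysis described above can be outlined with standard statistics routines. The sketch below assumes two paired arrays of per-patient EF values (gated SPECT vs. echocardiography) generated at random for illustration; it is not the study's code or data.

```python
# Sketch: correlation and paired difference between two LVEF measurements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ef_spect = rng.normal(60, 10, size=50)              # e.g. G-Rest LVEF (%), illustrative
ef_echo = ef_spect + rng.normal(3, 4, size=50)      # systematic offset plus noise

r, p_corr = stats.pearsonr(ef_spect, ef_echo)       # strength of linear association
t, p_diff = stats.ttest_rel(ef_spect, ef_echo)      # paired test of the mean difference

print(f"Pearson r = {r:.3f} (p = {p_corr:.3g})")
print(f"Paired t-test: t = {t:.2f}, p = {p_diff:.3g}")
```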

Accuracy Evaluation of Tumor Therapy during Respiratory Gated Radiation Therapy (호흡동조방사선 치료 시 종양 치료의 정확도 평가)

  • Jang, Eun-Sung;Kang, Soo-Man;Lee, Chol-Soo;Kang, Se-Sik
    • The Journal of Korean Society for Radiation Therapy, v.22 no.2, pp.113-122, 2010
  • Purpose: To evaluate the accuracy of target positioning in the static and dynamic states using a dynamic phantom, accounting for the difference between the tumor's actual movement during respiratory-gated radiation therapy and the skin movement measured by RPM (Real-time Position Management). Materials and Methods: A dynamic phantom that moves two-dimensionally was built in-house to simulate a tumor moving with respiration. After placing a marker block on the dynamic phantom, the amplitude and phase changes were analyzed with RPM according to preset respiratory periods. Based on these results, the marker block was positioned on the dynamic phantom, Gafchromic EBT film was inserted into the target, and 5 Gy was delivered in both the static and dynamic states. The irradiated Gafchromic EBT film was then scanned and the dose distribution was analyzed using automatic calculation. Results: Analysis of the Gafchromic EBT film doses in the static and dynamic states showed that the 90% dose distribution lay within a 3 mm margin of error. Conclusion: From the analysis of the change in dose distribution according to the patient's respiratory cycle during respiratory-gated radiation therapy, treatment is expected to be possible within the margin of error recommended by ICRP 60.
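
One simple way to check whether the 90% dose region of the static and dynamic films agrees within 3 mm is to locate the 90% isodose edges on matched dose profiles. The sketch below assumes two 1D profiles sampled at a known pixel spacing and uses synthetic Gaussian-shaped profiles purely for illustration; it is not the study's analysis software.

```python
# Sketch: compare the 90% isodose edge positions of two film dose profiles.
import numpy as np

def isodose_edges(profile, spacing_mm, level=0.9):
    """Return (left, right) positions in mm where the profile crosses `level` of its maximum."""
    x = np.arange(len(profile)) * spacing_mm
    norm = profile / profile.max()
    above = np.where(norm >= level)[0]
    return x[above[0]], x[above[-1]]

spacing = 0.5  # mm per pixel (illustrative scan resolution)
static = np.exp(-((np.arange(200) - 100) ** 2) / (2 * 30 ** 2))   # synthetic static profile
dynamic = np.exp(-((np.arange(200) - 104) ** 2) / (2 * 32 ** 2))  # synthetic blurred/shifted profile

s_edges = isodose_edges(static, spacing)
d_edges = isodose_edges(dynamic, spacing)
shifts = [abs(a - b) for a, b in zip(s_edges, d_edges)]
print("edge shifts (mm):", shifts, "within 3 mm:", all(s <= 3.0 for s in shifts))
```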

Consideration of density matching technique of the plate type direct radiologic image system and the conventional X-ray film;first step for the subtraction (Ektaspeed plus 필름을 이용한 일반 방사선시스템과 Digora를 이용한 디지탈 영상시스템의 밀도변화 비교연구)

  • So, Sung-Soo;Noh, Hyeun-Soo;Kim, Chang-Sung;Choi, Seong-Ho;Kim, Kee-Deog;Cho, Kyoo-Sung
    • Journal of Periodontal and Implant Science, v.32 no.1, pp.199-211, 2002
  • Digital subtraction and computer-assisted densitometric analysis detect minor changes in bone density and thus increase diagnostic accuracy. This advantage, together with the high sensitivity and objectivity that preclude human bias, has drawn interest in the radiologic research area. The objectives of this study were to verify whether radiographic density follows a linear pattern when the density profile of a standard periapical radiograph, using an aluminium step wedge as the reference, is examined under the various circumstances that can be encountered in clinical situations, and to establish the relationship between the existing standard radiographic system and future digital image systems by confirming the correlation between the standard radiograph and the Digora system, a digital image system currently in use. For quantitative analysis of bone tissue, a digital image system using a high-resolution automatic slide scanner as the input device and the Digora system were compared and analyzed with the multifunctional program Brain3dsp. The following conclusions were obtained. 1. Under a common clinical condition of 70 kVp, 0.2 sec and 10 cm focal distance, the Al-equivalent image equation was Y=11.21X+46.62 ($r^2=0.9898$) for the standard radiographic system and Y=12.68X+74.59 ($r^2=0.9528$) for the Digora system, and a linear relation was confirmed in both systems. 2. In the standard radiographic system, when all conditions were kept the same except for the developing solution, the Al-equivalent image equation was Y=10.07X+41.64 ($r^2=0.9861$), showing a high correlation. 3. When all conditions were kept the same except for the kilovoltage peak, the linear relationship was still maintained at 60 kVp, with Al-equivalent image equations of Y=14.60X+68.86 ($r^2=0.9886$) for the standard radiographic system and Y=13.90X+80.68 ($r^2=0.9238$) for the Digora system. 4. When all conditions were kept the same except for the exposure time, which was varied from 0.01 sec to 0.8 sec, the Al-equivalent image equation was linear in both the standard radiographic system and the Digora system. The R-square values ranged from 0.9188 to 0.9900, and in general the standard radiographic system showed a higher R-square than the Digora system. 5. When all conditions were kept the same except for the focal distance, which was varied from 5 cm to 30 cm, the Al-equivalent image equation was linear in both the standard radiographic system and the Digora system. The R-square values ranged from 0.9463 to 0.9925, and the standard radiographic system tended to show higher R-square values at shorter focal distances.
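
The Al-equivalent image equations above are ordinary least-squares fits of measured gray values (Y) against aluminium step thickness (X), reported with the coefficient of determination. A minimal sketch of such a fit follows; the step-wedge readings are made up for illustration, not the study's measurements.

```python
# Sketch: fit a linear Al-equivalent equation Y = aX + b and report r^2.
import numpy as np

steps_mm = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)          # aluminium step thickness (X)
gray = np.array([58, 69, 80, 92, 103, 113, 125, 136], dtype=float)  # mean ROI gray value (Y), illustrative

a, b = np.polyfit(steps_mm, gray, 1)            # slope and intercept of the least-squares line
pred = a * steps_mm + b
ss_res = np.sum((gray - pred) ** 2)
ss_tot = np.sum((gray - gray.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

print(f"Y = {a:.2f}X + {b:.2f}, r^2 = {r2:.4f}")
```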

A Template-based Interactive University Timetabling Support System (템플릿 기반의 상호대화형 전공강의시간표 작성지원시스템)

  • Chang, Yong-Sik;Jeong, Ye-Won
    • Journal of Intelligence and Information Systems, v.16 no.3, pp.121-145, 2010
  • University timetabling, which depends on the educational environment of each university, is an NP-hard problem in which the amount of computation required to find a solution increases exponentially with problem size. For many years there have been numerous studies on university timetabling, motivated by the need for automatic timetable generation for students' convenience and effective lessons, and for the effective allocation of subjects, lecturers, and classrooms. Timetables are classified into course timetables and examination timetables; this study focuses on the former. In general, the course timetable for liberal arts is scheduled by the office of academic affairs, while the course timetable for major subjects is scheduled by each department of a university. We found several problems from an analysis of current course timetabling in departments. First, it is time-consuming and inefficient for each department to do the routine and repetitive timetabling work manually. Second, many classes are concentrated into a few time slots of the timetable, which decreases the effectiveness of students' classes. Third, several major subjects may overlap required liberal arts subjects in the same time slots, in which case students must choose only one of the overlapping subjects. Fourth, many subjects are lectured by the same lecturers every year, and most lecturers prefer the same time slots as in the previous year, which means it is helpful for departments to reuse previous timetables. To solve these problems and support effective course timetabling in each department, this study proposes a university timetabling support system based on two phases. In the first phase, each department generates a timetable template from the most similar previous timetable case, based on case-based reasoning. In the second phase, the department schedules the timetable through an interactive user interface under the timetabling criteria, based on a rule-based approach. This study provides illustrations from Hanshin University. We classified the timetabling criteria into intrinsic and extrinsic criteria. The intrinsic criteria comprise three criteria related to lecturer, class, and classroom, all of which are hard constraints. The extrinsic criteria comprise four criteria related to 'the number of lesson hours' per lecturer, 'prohibition of lecture allocation to specific day-hours' for committee members, 'the number of subjects in the same day-hour,' and 'the use of common classrooms.' 'The number of lesson hours' per lecturer itself has three criteria: 'minimum number of lesson hours per week,' 'maximum number of lesson hours per week,' and 'maximum number of lesson hours per day.' The extrinsic criteria are also all hard constraints, except for 'minimum number of lesson hours per week,' which is treated as a soft constraint. In addition, we proposed two indices: one for measuring the similarity between the subjects of the current semester and the subjects of previous timetables, and one for evaluating the distribution degree of a scheduled timetable (see the sketch below). Similarity is measured by comparing two attributes, subject name and lecturer, between the current semester and a previous semester. The distribution-degree index, based on information entropy, indicates how subjects are distributed over the timetable. To show the viability of this study, we implemented a prototype system and performed experiments with real data from Hanshin University. The average similarity of the most similar cases across all departments was estimated at 41.72%, which means that a timetable template generated from the most similar case will be helpful. Through sensitivity analysis, the results show that the distribution degree increases if 'the number of subjects in the same day-hour' criterion is set to more than 90%.
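
The distribution-degree index mentioned above is described as being based on information entropy over the day-hour slots of a scheduled timetable. Since the paper's exact formula is not reproduced in the abstract, the sketch below shows one plausible normalized-entropy formulation with hypothetical slot counts.

```python
# Sketch: a normalized-entropy "distribution degree" for subjects over timetable slots.
# `slot_counts[i]` is the number of subjects scheduled in day-hour slot i (hypothetical data).
import math

def distribution_degree(slot_counts):
    total = sum(slot_counts)
    probs = [c / total for c in slot_counts if c > 0]
    entropy = -sum(p * math.log2(p) for p in probs)
    max_entropy = math.log2(len(slot_counts))   # entropy of a perfectly even spread
    return entropy / max_entropy                # 1.0 = evenly spread, 0.0 = all in one slot

concentrated = [12, 0, 0, 0, 0, 0, 0, 0]   # everything packed into one slot
spread_out   = [2, 1, 2, 1, 2, 1, 2, 1]    # subjects spread across the week
print(distribution_degree(concentrated), distribution_degree(spread_out))
```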

Analysis of Building Characteristics and Temporal Changes of Fire Alarms (건물 특성과 시간적 변화가 소방시설관리시스템의 화재알람에 미치는 영향 분석 연구)

  • Lim, Gwanmuk;Ko, Seoltae;Kim, Yoosin;Park, Keon Chul
    • Journal of Internet Computing and Services, v.22 no.4, pp.83-98, 2021
  • The purpose of this study is to identify the factors influencing fire alarms using data from the IoT firefighting facility management system of the Seoul Fire & Disaster Headquarters, and to present academic implications for establishing an effective fire-prevention system. As the number of high-rise and complex buildings increases and existing buildings are upgraded, fire detection facilities that can respond quickly to emergency situations are also increasing. However, if fire situations are detected incorrectly and accuracy is lowered, residents' inconvenience increases and reliability decreases. Therefore, it is necessary to improve the accuracy of the system through efficient inspection and investigation of the internal environment of buildings. This study shows that false detections can occur due to building characteristics such as usage or time, and emphasizes the need for efficient system inspection and control of the internal environment. As a result, the size (total area) of the building was found to have the greatest effect on fire alarms, and fire alarms increased for private buildings, R-type receivers, and buildings with a large number of failure or shutoff days. In addition, the factors influencing fire alarms differed depending on the main usage of the building. In terms of time, alarms followed people's daily patterns on weekdays (9 am to 6 pm), peaking around 10 am and 2 pm. This study argues that it is necessary to investigate the building environment that caused the fire alarms, along with internal system inspection, and proposes additional real-time recording of building environment data for follow-up research and system enhancement.
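
Relating building characteristics to alarm frequency, as described above, is a count-regression problem. The sketch below fits a Poisson GLM on hypothetical building-level features (total area, ownership, receiver type, failure days); the column names, data, and model choice are illustrative assumptions, not the paper's actual model or dataset.

```python
# Sketch: Poisson regression of fire-alarm counts on building characteristics.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "total_area": rng.lognormal(8, 1, n),       # building size (m^2), synthetic
    "is_private": rng.integers(0, 2, n),        # 1 = private building
    "r_type_receiver": rng.integers(0, 2, n),   # 1 = R-type receiver
    "failure_days": rng.poisson(5, n),          # days with failure or shutoff
})
# Synthetic alarm counts loosely driven by the same features, for illustration only.
rate = np.exp(0.3 * np.log(df["total_area"]) + 0.2 * df["is_private"]
              + 0.3 * df["r_type_receiver"] + 0.05 * df["failure_days"] - 2)
df["alarms"] = rng.poisson(rate)

X = sm.add_constant(np.column_stack([np.log(df["total_area"]), df["is_private"],
                                     df["r_type_receiver"], df["failure_days"]]))
result = sm.GLM(df["alarms"], X, family=sm.families.Poisson()).fit()
print(result.summary())
```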

A Study on the Effect of the Document Summarization Technique on the Fake News Detection Model (문서 요약 기법이 가짜 뉴스 탐지 모형에 미치는 영향에 관한 연구)

  • Shim, Jae-Seung;Won, Ha-Ram;Ahn, Hyunchul
    • Journal of Intelligence and Information Systems, v.25 no.3, pp.201-220, 2019
  • Fake news has emerged as a significant issue over the last few years, igniting discussions and research on how to solve this problem. In particular, studies on automated fact-checking and fake news detection using artificial intelligence and text analysis techniques have drawn attention. Fake news detection research is a form of document classification; thus, document classification techniques have been widely used in this type of research. However, document summarization techniques have been inconspicuous in this field. At the same time, automatic news summarization services have become popular, and a recent study found that using news summarized through abstractive summarization strengthened the predictive performance of fake news detection models. Therefore, the need to study the integration of document summarization technology in the domestic news data environment has become evident. To examine the effect of extractive summarization on the fake news detection model, we first summarized news articles through extractive summarization. Second, we created a summarized-news-based detection model. Finally, we compared our model with the full-text-based detection model. The study found that BPN (Back Propagation Neural Network) and SVM (Support Vector Machine) did not exhibit a large difference in performance; however, for DT (Decision Tree), the full-text-based model demonstrated somewhat better performance. In the case of LR (Logistic Regression), our model exhibited superior performance. Nonetheless, the results did not show a statistically significant difference between our model and the full-text-based model. Therefore, when summarization is applied, at least the core information of the fake news is preserved, and the LR-based model suggests the possibility of performance improvement. This study features an experimental application of extractive summarization to fake news detection research employing various machine-learning algorithms. The study's limitations are, essentially, the relatively small amount of data and the lack of comparison among various summarization technologies. Therefore, an in-depth analysis that applies various analytical techniques to a larger data volume would be helpful in the future.
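
The experimental design above (the same classifiers trained on full articles vs. extractively summarized articles) can be outlined with a standard bag-of-words pipeline. The sketch below uses scikit-learn with a tiny synthetic corpus and a trivial lead-sentence "extractive summary", so it illustrates the comparison setup under stated assumptions rather than the paper's actual summarizer, corpus, or tuned models.

```python
# Sketch: compare fake-news classifiers trained on full articles vs. extractive summaries.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.svm import LinearSVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Tiny synthetic corpus purely so the pipeline runs end to end; real news data is assumed in practice.
fake = [f"shocking miracle cure number {i} that doctors hide. click now. unbelievable claim." for i in range(20)]
real = [f"the city council approved budget item {i} on tuesday. officials provided details. residents attended." for i in range(20)]
articles = fake + real
labels = [1] * 20 + [0] * 20   # 1 = fake, 0 = real

def lead_sentences(text, k=2):
    """Toy extractive 'summary': keep the first k sentences (stand-in for a real extractor)."""
    return " ".join(text.split(". ")[:k])

summaries = [lead_sentences(a) for a in articles]

for name, clf in [("LR", LogisticRegression(max_iter=1000)),
                  ("SVM", LinearSVC()),
                  ("DT", DecisionTreeClassifier())]:
    for variant, texts in [("full text", articles), ("summary", summaries)]:
        pipe = make_pipeline(TfidfVectorizer(), clf)
        score = cross_val_score(pipe, texts, labels, cv=5).mean()
        print(f"{name} on {variant}: accuracy = {score:.3f}")
```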

Topic Modeling Insomnia Social Media Corpus using BERTopic and Building Automatic Deep Learning Classification Model (BERTopic을 활용한 불면증 소셜 데이터 토픽 모델링 및 불면증 경향 문헌 딥러닝 자동분류 모델 구축)

  • Ko, Young Soo;Lee, Soobin;Cha, Minjung;Kim, Seongdeok;Lee, Juhee;Han, Ji Yeong;Song, Min
    • Journal of the Korean Society for Information Management, v.39 no.2, pp.111-129, 2022
  • Insomnia is a chronic disease in modern society, and the number of new patients has increased by more than 20% in the last 5 years. Insomnia is a serious condition that requires diagnosis and treatment, because the individual and social problems caused by lack of sleep are severe and the triggers of insomnia are complex. This study collected 5,699 posts from 'insomnia', a community on the social media platform 'Reddit' where opinions are freely expressed. Based on the International Classification of Sleep Disorders (ICSD-3) standard and guidelines drawn up with the help of experts, an insomnia corpus was constructed by tagging the posts as insomnia-tendency documents or non-insomnia-tendency documents. Five deep learning language models (BERT, RoBERTa, ALBERT, ELECTRA, XLNet) were trained using the constructed insomnia corpus as training data. In the performance evaluation, RoBERTa showed the highest performance with an accuracy of 81.33%. For an in-depth analysis of the insomnia social data, topic modeling was performed using the recently introduced BERTopic method, which supplements the weaknesses of LDA, the widely used traditional approach. The analysis confirmed 8 subject groups ('Negative emotions', 'Advice and help and gratitude', 'Insomnia-related diseases', 'Sleeping pills', 'Exercise and eating habits', 'Physical characteristics', 'Activity characteristics', 'Environmental characteristics'). Users expressed negative emotions and sought help and advice from the Reddit insomnia community. In addition, they mentioned diseases related to insomnia, shared discourse on the use of sleeping pills, and expressed interest in exercise and eating habits. As insomnia-related characteristics, we found physical characteristics such as breathing, pregnancy, and the heart, activity characteristics such as feeling like a zombie, hypnic jerks, and grogginess, and environmental characteristics such as sunlight, blankets, temperature, and naps.
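
The topic-modeling step described above can be outlined with the BERTopic library. The sketch below assumes a list of Reddit-style post strings (here a small synthetic list repeated to give the clustering something to work with) and default settings; it is not the paper's tuned configuration, corpus, or its separate classification models.

```python
# Sketch: topic modeling of insomnia posts with BERTopic (synthetic placeholder documents).
from bertopic import BERTopic

base = [
    "I cannot fall asleep before 4am and feel like a zombie all day",
    "melatonin and sleeping pills stopped working for me",
    "my doctor suggested exercise and better eating habits",
    "waking up at 3am every night with my heart racing",
    "any advice for sleeping through the night, thank you all",
    "blackout blankets and a cooler room temperature helped a little",
]
# In practice the ~5,700 collected Reddit posts would go here.
docs = [f"{text} (post {i})" for i in range(100) for text in base]

topic_model = BERTopic(language="english")
topics, probs = topic_model.fit_transform(docs)   # assign a topic id to each document

print(topic_model.get_topic_info())               # table of topics with top keywords
print(topic_model.get_topic(0))                   # top words of topic 0
```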

Development of System for Real-Time Object Recognition and Matching using Deep Learning at Simulated Lunar Surface Environment (딥러닝 기반 달 표면 모사 환경 실시간 객체 인식 및 매칭 시스템 개발)

  • Jong-Ho Na;Jun-Ho Gong;Su-Deuk Lee;Hyu-Soung Shin
    • Tunnel and Underground Space, v.33 no.4, pp.281-298, 2023
  • Continuous research efforts are being devoted to unmanned mobile platforms for lunar exploration. There is an ongoing demand for real-time information processing to accurately determine the positioning and mapping of areas of interest on the lunar surface. To apply deep learning processing and analysis techniques to practical rovers, research on software integration and optimization is imperative. In this study, a foundational investigation was conducted on the real-time analysis of images of a virtual lunar base construction site, aimed at automatically quantifying spatial information of key objects. The study involved transitioning from an existing region-based object recognition algorithm to a bounding-box-based algorithm, thus enhancing object recognition accuracy and inference speed. To facilitate large-scale data-based object matching training, the Batch Hard Triplet Mining technique was introduced, and research was conducted to optimize both the training and inference processes. Furthermore, an improved software system for object recognition and identical-object matching was integrated, accompanied by the development of visualization software for the automatic matching of identical objects within input images. Using simulated satellite-captured video data for training and video data captured from the moving platform for inference, training and inference for identical-object matching were successfully executed. The outcomes of this research suggest the feasibility of deriving 3D spatial information from continuously captured video data of mobile platforms and using it to position objects within regions of interest. These findings are expected to contribute to an integrated automated on-site system for video-based construction monitoring and control of significant target objects within future lunar base construction sites.
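
Batch Hard Triplet Mining, mentioned above for object-matching training, selects for each anchor the hardest positive and hardest negative within a mini-batch and applies a margin-based triplet loss. A minimal PyTorch sketch of that loss is given below; the embedding size, margin, and batch composition are illustrative assumptions, not the study's settings.

```python
# Sketch: batch-hard triplet loss for learning object-matching embeddings.
import torch

def batch_hard_triplet_loss(embeddings, labels, margin=0.3):
    """embeddings: (N, D) L2-normalized features; labels: (N,) object identities."""
    dist = torch.cdist(embeddings, embeddings, p=2)             # pairwise distances (N, N)
    same = labels.unsqueeze(0) == labels.unsqueeze(1)           # (N, N) same-identity mask
    eye = torch.eye(len(labels), dtype=torch.bool, device=embeddings.device)

    pos_mask = same & ~eye                                      # positives: same identity, not self
    neg_mask = ~same                                            # negatives: different identity

    # Hardest positive: farthest same-identity sample for each anchor.
    hardest_pos = (dist * pos_mask).max(dim=1).values
    # Hardest negative: closest different-identity sample for each anchor.
    hardest_neg = dist.masked_fill(~neg_mask, float("inf")).min(dim=1).values

    return torch.relu(hardest_pos - hardest_neg + margin).mean()

# Illustrative batch: 8 embeddings covering 4 object identities (2 views each).
emb = torch.nn.functional.normalize(torch.randn(8, 128), dim=1)
ids = torch.tensor([0, 0, 1, 1, 2, 2, 3, 3])
print(batch_hard_triplet_loss(emb, ids))
```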

Utilization of Smart Farms in Open-field Agriculture Based on Digital Twin (디지털 트윈 기반 노지스마트팜 활용방안)

  • Kim, Sukgu
    • Proceedings of the Korean Society of Crop Science Conference, 2023.04a, pp.7-7, 2023
  • Currently, the main technologies of the various fourth-industrial-revolution fields are big data, the Internet of Things, artificial intelligence, blockchain, mixed reality (MR), and drones. In particular, the 'digital twin', which has recently become a global technological trend, is the concept of a virtual model that represents a physical object identically on a computer. By creating and simulating a digital twin of software-virtualized assets instead of real physical assets, accurate information about the characteristics of real farming (current state, agricultural productivity, agricultural work scenarios, etc.) can be obtained. This study aims to streamline agricultural work through automatic water management, remote growth forecasting, drone control, and pest forecasting via an integrated control system, by constructing digital twin data on major open-field production areas and designing and building a smart farm complex. In addition, it aims to disseminate digital environment-controlled agriculture in Korea that can reduce labor and improve crop productivity while minimizing environmental load through the use of appropriate amounts of fertilizers and pesticides informed by big data analysis. These open-field agricultural technologies can reduce labor through digital farming and cultivation management, optimize water use and prevent soil pollution in preparation for climate change, and enable quantitative growth management of open-field crops by securing digital data on the national cultivation environment. They are also a way to directly implement carbon-neutral RED++ activities by improving agricultural productivity. The analysis and prediction of growth status based on acquired high-precision, high-definition image-based crop growth data are very effective for digital farming work management. The Southern Crop Department of the National Institute of Food Science has conducted research and development on various types of open-field agricultural smart farms, such as underground drip and underground drainage types. In particular, from this year, commercialization is under way in earnest through the establishment of smart farm facilities and technology distribution to agricultural technology complexes across the country. In this study, we describe a case of establishing an agricultural field that combines digital twin technology with open-field agricultural smart farm technology, and future utilization plans.
